Dec 02 07:45:51 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 02 07:45:51 crc restorecon[4690]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 
07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 02 07:45:51 crc 
restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 
07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:51 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 
07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc 
restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:45:52 crc restorecon[4690]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 02 07:45:52 crc restorecon[4690]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 02 07:45:52 crc kubenswrapper[4691]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 07:45:52 crc kubenswrapper[4691]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 02 07:45:52 crc kubenswrapper[4691]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 02 07:45:52 crc kubenswrapper[4691]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 02 07:45:52 crc kubenswrapper[4691]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Dec 02 07:45:52 crc kubenswrapper[4691]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.362037 4691 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365838 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365858 4691 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365865 4691 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365870 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365876 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365883 4691 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365888 4691 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365895 4691 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365900 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365905 4691 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365910 4691 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365916 4691 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365921 4691 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365926 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365931 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365936 4691 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365942 4691 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365947 4691 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365952 4691 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365958 4691 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365963 4691 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365968 4691 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365973 4691 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365981 4691 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365987 4691 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365992 4691 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.365997 4691 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366002 4691 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366008 4691 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366012 4691 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366017 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366022 4691 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366027 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366033 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366037 4691 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366042 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366047 4691 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366052 4691 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366057 4691 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366062 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366067 4691 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366071 4691 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366076 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366084 4691 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366091 4691 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366097 4691 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366103 4691 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366109 4691 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366115 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366121 4691 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366126 4691 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366131 4691 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366137 4691 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366142 4691 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366147 4691 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366151 4691 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366156 4691 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366161 4691 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366165 4691 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366174 4691 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366180 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366185 4691 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366191 4691 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366196 4691 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366201 4691 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366206 4691 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366211 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366216 4691 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366221 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366227 4691 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.366233 4691 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366554 4691 flags.go:64] FLAG: --address="0.0.0.0"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366568 4691 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366575 4691 flags.go:64] FLAG: --anonymous-auth="true"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366580 4691 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366585 4691 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366590 4691 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366595 4691 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366601 4691 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366605 4691 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366611 4691 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366615 4691 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366620 4691 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366624 4691 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366628 4691 flags.go:64] FLAG: --cgroup-root=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366632 4691 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366636 4691 flags.go:64] FLAG: --client-ca-file=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366640 4691 flags.go:64] FLAG: --cloud-config=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366644 4691 flags.go:64] FLAG: --cloud-provider=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366648 4691 flags.go:64] FLAG: --cluster-dns="[]"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366654 4691 flags.go:64] FLAG: --cluster-domain=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366658 4691 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366662 4691 flags.go:64] FLAG: --config-dir=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366666 4691 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366670 4691 flags.go:64] FLAG: --container-log-max-files="5"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366675 4691 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366680 4691 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366684 4691 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366689 4691 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366693 4691 flags.go:64] FLAG: --contention-profiling="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366697 4691 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366701 4691 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366706 4691 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366710 4691 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366715 4691 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366720 4691 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366725 4691 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366729 4691 flags.go:64] FLAG: --enable-load-reader="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366733 4691 flags.go:64] FLAG: --enable-server="true"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366739 4691 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366744 4691 flags.go:64] FLAG: --event-burst="100"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366748 4691 flags.go:64] FLAG: --event-qps="50"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366753 4691 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366757 4691 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366761 4691 flags.go:64] FLAG: --eviction-hard=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366767 4691 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366774 4691 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366778 4691 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366794 4691 flags.go:64] FLAG: --eviction-soft=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366799 4691 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366803 4691 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366808 4691 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366812 4691 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366817 4691 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366821 4691 flags.go:64] FLAG: --fail-swap-on="true"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366826 4691 flags.go:64] FLAG: --feature-gates=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366832 4691 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366837 4691 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366842 4691 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366846 4691 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366850 4691 flags.go:64] FLAG: --healthz-port="10248"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366855 4691 flags.go:64] FLAG: --help="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366859 4691 flags.go:64] FLAG: --hostname-override=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366863 4691 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366868 4691 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366872 4691 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366876 4691 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366880 4691 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366884 4691 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366888 4691 flags.go:64] FLAG: --image-service-endpoint=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366892 4691 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366896 4691 flags.go:64] FLAG: --kube-api-burst="100"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366901 4691 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366905 4691 flags.go:64] FLAG: --kube-api-qps="50"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366910 4691 flags.go:64] FLAG: --kube-reserved=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366914 4691 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366918 4691 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366922 4691 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366926 4691 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366930 4691 flags.go:64] FLAG: --lock-file=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366935 4691 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366939 4691 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366943 4691 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366949 4691 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366953 4691 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366957 4691 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366961 4691 flags.go:64] FLAG: --logging-format="text"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366965 4691 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366970 4691 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366974 4691 flags.go:64] FLAG: --manifest-url=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366978 4691 flags.go:64] FLAG: --manifest-url-header=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366984 4691 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366988 4691 flags.go:64] FLAG: --max-open-files="1000000"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366994 4691 flags.go:64] FLAG: --max-pods="110"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.366998 4691 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367002 4691 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367007 4691 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367011 4691 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367015 4691 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367019 4691 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367023 4691 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367033 4691 flags.go:64] FLAG: --node-status-max-images="50"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367037 4691 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367041 4691 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367046 4691 flags.go:64] FLAG: --pod-cidr=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367050 4691 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367057 4691 flags.go:64] FLAG: --pod-manifest-path=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367062 4691 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367066 4691 flags.go:64] FLAG: --pods-per-core="0"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367071 4691 flags.go:64] FLAG: --port="10250"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367075 4691 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367079 4691 flags.go:64] FLAG: --provider-id=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367083 4691 flags.go:64] FLAG: --qos-reserved=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367087 4691 flags.go:64] FLAG: --read-only-port="10255"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367091 4691 flags.go:64] FLAG: --register-node="true"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367095 4691 flags.go:64] FLAG: --register-schedulable="true"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367100 4691 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367106 4691 flags.go:64] FLAG: --registry-burst="10"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367110 4691 flags.go:64] FLAG: --registry-qps="5"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367115 4691 flags.go:64] FLAG: --reserved-cpus=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367119 4691 flags.go:64] FLAG: --reserved-memory=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367124 4691 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367128 4691 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367132 4691 flags.go:64] FLAG: --rotate-certificates="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367136 4691 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367141 4691 flags.go:64] FLAG: --runonce="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367145 4691 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367149 4691 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367153 4691 flags.go:64] FLAG: --seccomp-default="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367157 4691 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367161 4691 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367166 4691 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367170 4691 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367174 4691 flags.go:64] FLAG: --storage-driver-password="root"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367178 4691 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367182 4691 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367186 4691 flags.go:64] FLAG: --storage-driver-user="root"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367190 4691 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367194 4691 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367198 4691 flags.go:64] FLAG: --system-cgroups=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367202 4691 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367208 4691 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367212 4691 flags.go:64] FLAG: --tls-cert-file=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367217 4691 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367223 4691 flags.go:64] FLAG: --tls-min-version=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367227 4691 flags.go:64] FLAG: --tls-private-key-file=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367231 4691 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367235 4691 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367239 4691 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367243 4691 flags.go:64] FLAG: --v="2"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367249 4691 flags.go:64] FLAG: --version="false"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367254 4691 flags.go:64] FLAG: --vmodule=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367259 4691 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367264 4691 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367382 4691 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367388 4691 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367396 4691 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367400 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367404 4691 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367408 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367411 4691 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367415 4691 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367419 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367422 4691 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367425 4691 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367429 4691 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367432 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367436 4691 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367440 4691 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367443 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367446 4691 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367450 4691 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367454 4691 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367457 4691 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367462 4691 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367466 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367470 4691 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367474 4691 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367478 4691 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367483 4691 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367486 4691 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367490 4691 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367494 4691 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367497 4691 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367501 4691 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367504 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367508 4691 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367512 4691 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367518 4691 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367523 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367527 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367531 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367534 4691 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367539 4691 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367543 4691 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367547 4691 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367551 4691 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367554 4691 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367558 4691 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367561 4691 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367565 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367568 4691 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367572 4691 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367575 4691 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367580 4691 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367584 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367588 4691 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367591 4691 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367595 4691 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367599 4691 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367602 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367606 4691 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367609 4691 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367613 4691 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367616 4691 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367621 4691 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367626 4691 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367630 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367634 4691 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367638 4691 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367643 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367647 4691 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367650 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367654 4691 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.367657 4691 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.367663 4691 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.375605 4691 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.375643 4691 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375765 4691 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375779 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375809 4691 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375819 4691 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375828 4691 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375837 4691 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375845 4691 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375856 4691 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375866 4691 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375875 4691 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375883 4691 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375891 4691 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375899 4691 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375907 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375917 4691 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375928 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375941 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375952 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375961 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375969 4691 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375977 4691 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375986 4691 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.375996 4691 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376007 4691 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376017 4691 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376027 4691 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376037 4691 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376047 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376056 4691 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376065 4691 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376075 4691 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376087 4691 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376096 4691 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376106 4691 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376119 4691 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376132 4691 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376142 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376154 4691 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376163 4691 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376174 4691 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376184 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376194 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376204 4691 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376214 4691 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376224 4691 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376237 4691 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376249 4691 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376261 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376272 4691 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376282 4691 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376293 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376303 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376315 4691 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376324 4691 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376334 4691 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376343 4691 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376353 4691 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376361 4691 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376369 4691 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376377 4691 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376385 4691 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376393 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376401 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376413 4691 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376421 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376431 4691 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376440 4691 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376447 4691 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376455 4691 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376463 4691 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376471 4691 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.376485 4691 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376735 4691 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376747 4691 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376764 4691 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376774 4691 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376807 4691 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376817 4691 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376824 4691 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376832 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376840 4691 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376849 4691 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376858 4691 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376866 4691 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376873 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376882 4691 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376890 4691 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376898 4691 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376908 4691 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376916 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376924 4691 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376932 4691 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376940 4691 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376947 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376955 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376963 4691 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376972 4691 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376980 4691 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376988 4691 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.376996 4691 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377003 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377011 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377019 4691 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377029 4691 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377039 4691 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377048 4691 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377058 4691 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377067 4691 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377076 4691 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377084 4691 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377092 4691 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377101 4691 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377110 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377118 4691 feature_gate.go:330] unrecognized feature gate: Example
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377126 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377134 4691 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377142 4691 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377150 4691 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377161 4691 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377171 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377179 4691 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377187 4691 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377198 4691 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377208 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377218 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377227 4691 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377236 4691 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377244 4691 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377254 4691 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377264 4691 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377272 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377280 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377288 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377296 4691 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377304 4691 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377312 4691 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377319 4691 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377327 4691 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377337 4691 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377347 4691 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377355 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377362 4691 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.377370 4691 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.377382 4691 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.377927 4691 server.go:940] "Client rotation is on, will bootstrap in background"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.382023 4691 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.382157 4691 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.382991 4691 server.go:997] "Starting client certificate rotation"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.383020 4691 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.383519 4691 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-13 05:39:00.161937412 +0000 UTC
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.383642 4691 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 261h53m7.778300231s for next certificate rotation
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.391839 4691 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.397922 4691 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.408051 4691 log.go:25] "Validated CRI v1 runtime API"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.430452 4691 log.go:25] "Validated CRI v1 image API"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.433304 4691 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.437667 4691 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-02-07-41-43-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.437752 4691 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}]
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.470077 4691 manager.go:217] Machine: {Timestamp:2025-12-02 07:45:52.467206882 +0000 UTC m=+0.251285784 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:594c318f-f07e-4852-8352-7483c8d3d991 BootID:d3569d90-2e1c-4c42-8376-d393c2ea01f6 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ea:ae:25 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ea:ae:25 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:23:d6:0a Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:ed:a4:1c Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:57:3b:1b Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:31:b5:6c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:be:9a:ff:1e:f6:30 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:e6:65:5a:25:05:3c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.470535 4691 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
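[Editor's note] The certificate_manager.go entries a few lines up are internally consistent: the client cert expires 2026-02-24 05:52:08 UTC, the manager picked a jittered rotation deadline of 2025-12-13 05:39:00 UTC, and the reported wait of 261h53m7.8s is exactly deadline minus the current time. A sketch of that schedule; the 70-90% jitter window is an assumption about the upstream policy, not something this log states:

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// rotationDeadline sketches the jitter used by the kubelet's certificate
// manager: pick a point somewhere in the later part of the cert's validity
// window. The 0.7 + 0.2*rand fractions are an assumption, not read from
// this log.
func rotationDeadline(notBefore, notAfter time.Time) time.Time {
	total := notAfter.Sub(notBefore)
	jittered := time.Duration(float64(total) * (0.7 + 0.2*rand.Float64()))
	return notBefore.Add(jittered)
}

func main() {
	// Values taken from the log lines above.
	now := time.Date(2025, 12, 2, 7, 45, 52, 0, time.UTC)
	deadline := time.Date(2025, 12, 13, 5, 39, 0, 0, time.UTC)
	fmt.Println("waiting", deadline.Sub(now), "for next certificate rotation")
	// Prints 261h53m8s, matching "Waiting 261h53m7.778300231s" above.
}
```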
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.470751 4691 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.471420 4691 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.471748 4691 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.471858 4691 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.472182 4691 topology_manager.go:138] "Creating topology manager with none policy"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.472200 4691 container_manager_linux.go:303] "Creating device plugin manager"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.472728 4691 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.472809 4691 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.473374 4691 state_mem.go:36] "Initialized new in-memory state store"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.473533 4691 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.475166 4691 kubelet.go:418] "Attempting to sync node with API server"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.475217 4691 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
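[Editor's note] The nodeConfig dump is where node allocatable comes from: capacity minus KubeReserved (null here) minus SystemReserved minus the memory.available hard-eviction threshold. Plugging in the MemoryCapacity from the Machine entry above, a quick check of the arithmetic:

```go
package main

import "fmt"

const Mi = 1024 * 1024

func main() {
	// Values from the MemoryCapacity and nodeConfig entries above.
	capacity := int64(33654128640)    // bytes, ~31.3 GiB
	systemReserved := int64(350 * Mi) // SystemReserved memory
	hardEviction := int64(100 * Mi)   // memory.available hard threshold

	// allocatable = capacity - kube-reserved - system-reserved - eviction;
	// KubeReserved is null in this nodeConfig, so it drops out.
	allocatable := capacity - systemReserved - hardEviction
	fmt.Printf("allocatable memory: %d bytes (%.2f GiB)\n",
		allocatable, float64(allocatable)/(1024*1024*1024))
	// => 33182269440 bytes (~30.90 GiB) available for pods.
}
```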
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.475277 4691 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.475321 4691 kubelet.go:324] "Adding apiserver pod source"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.475349 4691 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.478225 4691 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.478848 4691 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.481034 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.481216 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError"
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.481040 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.481282 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.482754 4691 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.484650 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.484728 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.484750 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.484810 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.484862 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.484947 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.484995 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.485017 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.485035 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.485049 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.485100 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.485112 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.485153 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.485736 4691 server.go:1280] "Started kubelet"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.487946 4691 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.488301 4691 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.488368 4691 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.488667 4691 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 02 07:45:52 crc systemd[1]: Started Kubernetes Kubelet.
Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.489491 4691 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.222:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187d565518b2ec2e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 07:45:52.485690414 +0000 UTC m=+0.269769286,LastTimestamp:2025-12-02 07:45:52.485690414 +0000 UTC m=+0.269769286,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.490309 4691 server.go:460] "Adding debug handlers to kubelet server"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.492534 4691 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.492571 4691 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.492875 4691 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 11:07:45.301950423 +0000 UTC
Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.493071 4691 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.493732 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="200ms"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.493993 4691 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.494011 4691 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.494074 4691 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.495000 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.495098 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.495363 4691 factory.go:55] Registering systemd factory
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.495385 4691 factory.go:221] Registration of the systemd container factory successfully
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.495960 4691 factory.go:153] Registering CRI-O factory
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.496011 4691 factory.go:221] Registration of the crio container factory successfully
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.496108 4691 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.496139 4691 factory.go:103] Registering Raw factory
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.496156 4691 manager.go:1196] Started watching for new ooms in manager
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.496707 4691 manager.go:319] Starting recovery of all containers
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507382 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507454 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507467 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507478 4691 reconstruct.go:130] "Volume is marked
as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507495 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507512 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507523 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507536 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507551 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507563 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507572 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507586 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507610 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507642 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507669 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.507691 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510227 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510267 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510293 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510311 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510343 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510361 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510377 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510393 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510411 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510426 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510442 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510470 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510485 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510503 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510522 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510538 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510554 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510568 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510584 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510601 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510633 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510662 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510681 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510696 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510708 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510721 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510738 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510752 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510777 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510815 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510827 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510842 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510855 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510876 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510888 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510922 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510944 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510966 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.510984 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511001 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511017 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511030 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511044 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511056 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511070 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511096 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511108 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511123 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511137 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511149 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511165 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511178 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511192 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511209 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511223 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511240 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511464 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511476 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511491 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511503 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511517 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511527 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.511537 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512143 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512494 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512510 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512525 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512543 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512559 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512600 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512631 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512655 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512669 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512683 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512695 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512707 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512723 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512740 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512821 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512906 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512926 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512945 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.512993 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513013 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513039 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513069 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513091 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" 
volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513119 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513197 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513226 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513262 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513285 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513305 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513327 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513378 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513440 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513461 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513541 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513577 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513600 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513615 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513631 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513649 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513662 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513680 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513694 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513746 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513776 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513805 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513819 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513838 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513852 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513873 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513884 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513923 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513942 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513954 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.513973 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.514105 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.514575 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.515899 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.515954 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.515973 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.515989 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516008 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516023 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516040 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516057 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516073 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516088 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516104 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516121 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516139 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516155 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516172 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516189 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516203 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516218 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516233 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516255 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.516277 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517026 4691 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" 
volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517082 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517114 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517139 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517159 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517178 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517196 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517218 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517292 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517313 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517334 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 
07:45:52.517355 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517375 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517391 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517410 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517431 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517453 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517482 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517511 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517545 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517576 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517597 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: 
I1202 07:45:52.517619 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517638 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517657 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517673 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517688 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517705 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517721 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517737 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517752 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517773 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517809 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517826 
4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517841 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517857 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517877 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517894 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517910 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517925 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517941 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517956 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517972 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.517987 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.518003 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.518020 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.518035 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.518050 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.518066 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.518082 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.518098 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.518111 4691 reconstruct.go:97] "Volume reconstruction finished" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.518120 4691 reconciler.go:26] "Reconciler: start to sync state" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.530649 4691 manager.go:324] Recovery completed Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.542816 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.544485 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.544527 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.544538 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.546120 4691 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.546516 4691 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.546551 4691 state_mem.go:36] "Initialized new in-memory state store" Dec 02 07:45:52 crc kubenswrapper[4691]: 
I1202 07:45:52.558200 4691 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.560214 4691 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.560275 4691 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.560308 4691 kubelet.go:2335] "Starting kubelet main sync loop" Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.561418 4691 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.561939 4691 policy_none.go:49] "None policy: Start" Dec 02 07:45:52 crc kubenswrapper[4691]: W1202 07:45:52.561912 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.561977 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.563223 4691 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.563255 4691 state_mem.go:35] "Initializing new in-memory state store" Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.594049 4691 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.629022 4691 manager.go:334] "Starting Device Plugin manager" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.629083 4691 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.629095 4691 server.go:79] "Starting device plugin registration server" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.629605 4691 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.629622 4691 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.629853 4691 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.629940 4691 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.629948 4691 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.639182 4691 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.662479 4691 kubelet.go:2421] "SyncLoop ADD" source="file" 
pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.662585 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.664154 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.664182 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.664193 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.664376 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.664547 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.664598 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.665253 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.665276 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.665285 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.665428 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.665698 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.665793 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.665827 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.665842 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.665834 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.666448 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.666487 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.666497 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.666619 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.666722 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.666825 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.666965 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.667000 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.667012 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.667480 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.667503 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.667512 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.667607 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.667759 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.667796 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.669162 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.669177 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.669186 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.669194 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.669219 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.669196 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.670403 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.670458 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.670477 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.670722 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.670759 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.672340 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.672357 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.672364 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.695005 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="400ms" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.721805 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.721875 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.721927 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.721975 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.722018 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.722076 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.722112 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.722143 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.722171 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.722189 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.722205 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.722227 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.722241 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.722255 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.722269 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.730115 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.731120 4691 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.731173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.731189 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.731225 4691 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.731857 4691 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.222:6443: connect: connection refused" node="crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823308 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823388 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823428 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823463 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823498 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823528 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823534 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823599 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823561 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823557 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823622 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823534 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823632 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823696 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823809 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823820 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823851 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 02 07:45:52 crc 
kubenswrapper[4691]: I1202 07:45:52.823909 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823932 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823885 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823955 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823973 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823978 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823994 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.824000 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.824031 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.823992 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.824042 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.824034 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.824048 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.932920 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.934173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.934208 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.934219 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:52 crc kubenswrapper[4691]: I1202 07:45:52.934242 4691 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 02 07:45:52 crc kubenswrapper[4691]: E1202 07:45:52.934658 4691 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.222:6443: connect: connection refused" node="crc"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.001633 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.019220 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.025997 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 07:45:53 crc kubenswrapper[4691]: W1202 07:45:53.029121 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-5c411d6f75851d7abfa7e959cd47d0183a90a6d300e19ef93d98bfdfcc79d157 WatchSource:0}: Error finding container 5c411d6f75851d7abfa7e959cd47d0183a90a6d300e19ef93d98bfdfcc79d157: Status 404 returned error can't find the container with id 5c411d6f75851d7abfa7e959cd47d0183a90a6d300e19ef93d98bfdfcc79d157
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.049873 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 07:45:53 crc kubenswrapper[4691]: W1202 07:45:53.050909 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-cd2409d59694bf7c6d3bd36b3011cd8eb89377fef2a3682aadbd5ead69e9acdf WatchSource:0}: Error finding container cd2409d59694bf7c6d3bd36b3011cd8eb89377fef2a3682aadbd5ead69e9acdf: Status 404 returned error can't find the container with id cd2409d59694bf7c6d3bd36b3011cd8eb89377fef2a3682aadbd5ead69e9acdf
Dec 02 07:45:53 crc kubenswrapper[4691]: W1202 07:45:53.052589 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-10c230c5122b8e8af8e9cd9377a9c987a8a7ffb1db4cb532fe5d690a9cc92ed9 WatchSource:0}: Error finding container 10c230c5122b8e8af8e9cd9377a9c987a8a7ffb1db4cb532fe5d690a9cc92ed9: Status 404 returned error can't find the container with id 10c230c5122b8e8af8e9cd9377a9c987a8a7ffb1db4cb532fe5d690a9cc92ed9
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.052978 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 02 07:45:53 crc kubenswrapper[4691]: W1202 07:45:53.072943 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-206a272b41336b91118179501241e0c8e21482a928729915960110ada0418e3d WatchSource:0}: Error finding container 206a272b41336b91118179501241e0c8e21482a928729915960110ada0418e3d: Status 404 returned error can't find the container with id 206a272b41336b91118179501241e0c8e21482a928729915960110ada0418e3d
Dec 02 07:45:53 crc kubenswrapper[4691]: W1202 07:45:53.075301 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-e0338da460f775b01d34c7b0aabaa2a49e9a40cdf83d879c4e86ceed2f59557e WatchSource:0}: Error finding container e0338da460f775b01d34c7b0aabaa2a49e9a40cdf83d879c4e86ceed2f59557e: Status 404 returned error can't find the container with id e0338da460f775b01d34c7b0aabaa2a49e9a40cdf83d879c4e86ceed2f59557e
Dec 02 07:45:53 crc kubenswrapper[4691]: E1202 07:45:53.096656 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="800ms"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.335038 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.336073 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.336125 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.336135 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.336158 4691 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 02 07:45:53 crc kubenswrapper[4691]: E1202 07:45:53.336450 4691 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.222:6443: connect: connection refused" node="crc"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.489112 4691 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.493259 4691 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 22:55:58.931274793 +0000 UTC
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.566000 4691 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df" exitCode=0
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.566068 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df"}
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.566158 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cd2409d59694bf7c6d3bd36b3011cd8eb89377fef2a3682aadbd5ead69e9acdf"}
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.566292 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.567560 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.567591 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.567605 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.567897 4691 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d" exitCode=0
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.567965 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d"}
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.567980 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5c411d6f75851d7abfa7e959cd47d0183a90a6d300e19ef93d98bfdfcc79d157"}
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.568075 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.568856 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.568886 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.568895 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.568968 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.569318 4691 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="da31937f137f04fe5d98563a438a55fe2799a0312ac273f04fbd2627181297bd" exitCode=0
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.569343 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"da31937f137f04fe5d98563a438a55fe2799a0312ac273f04fbd2627181297bd"}
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.569365 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e0338da460f775b01d34c7b0aabaa2a49e9a40cdf83d879c4e86ceed2f59557e"}
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.569404 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.569929 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.569960 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.569972 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.570198 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.570237 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.570251 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.572177 4691 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201" exitCode=0
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.572251 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201"}
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.572296 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"206a272b41336b91118179501241e0c8e21482a928729915960110ada0418e3d"}
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.572374 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.573313 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.573347 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.573358 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.573919 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab"}
Dec 02 07:45:53 crc kubenswrapper[4691]: I1202 07:45:53.573954 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"10c230c5122b8e8af8e9cd9377a9c987a8a7ffb1db4cb532fe5d690a9cc92ed9"}
Dec 02 07:45:53 crc kubenswrapper[4691]: W1202 07:45:53.737160 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Dec 02 07:45:53 crc kubenswrapper[4691]: E1202 07:45:53.737234 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError"
Dec 02 07:45:53 crc kubenswrapper[4691]: E1202 07:45:53.897585 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="1.6s"
Dec 02 07:45:53 crc kubenswrapper[4691]: W1202 07:45:53.965674 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Dec 02 07:45:53 crc kubenswrapper[4691]: E1202 07:45:53.965778 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError"
Dec 02 07:45:54 crc kubenswrapper[4691]: W1202 07:45:54.018797 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.222:6443: connect: connection refused
Dec 02 07:45:54 crc kubenswrapper[4691]: E1202 07:45:54.018863 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.222:6443: connect: connection refused" logger="UnhandledError"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.136736 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.137798 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.137835 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.137844 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.137868 4691 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.494071 4691 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 20:29:45.614217684 +0000 UTC
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.494115 4691 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 132h43m51.120104388s for next certificate rotation
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.580179 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e"}
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.580245 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395"}
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.580255 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270"}
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.580279 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f"}
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.582378 4691 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad" exitCode=0
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.582431 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad"}
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.582542 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.583507 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.583529 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.583537 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.585944 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7970d6804ddbc7e241b8f7d14ac1508db5cc75ba5ff7654a8d5f378c16e498ff"}
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.586009 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.586605 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.586622 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.586631 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.589400 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"830344d995bf21e3650a9d520bea20b313abffebd4891ff834259c7f3fa6f11d"}
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.589431 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3496e0007711c8e371e0f7fdcc4907faab2b72487290947fa2060054478a7188"}
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.589446 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"001ae01619f89617dcf8704bacccb92d0c42ea23b8e935b170139d7a204a0ba9"}
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.589526 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.590344 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.590362 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.590370 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.593400 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef"}
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.593423 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82"}
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.593454 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948"}
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.593542 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.594870 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.594909 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:54 crc kubenswrapper[4691]: I1202 07:45:54.594923 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.600296 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df"}
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.600319 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.602258 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.602970 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.603007 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.604549 4691 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94" exitCode=0
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.604649 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.604672 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94"}
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.604806 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.604956 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.606693 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.606757 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.606792 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.608734 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.608778 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.608805 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.608811 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.608871 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.608915 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:55 crc kubenswrapper[4691]: I1202 07:45:55.638458 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 07:45:56 crc kubenswrapper[4691]: I1202 07:45:56.608968 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:56 crc kubenswrapper[4691]: I1202 07:45:56.609394 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b"}
Dec 02 07:45:56 crc kubenswrapper[4691]: I1202 07:45:56.609424 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 07:45:56 crc kubenswrapper[4691]: I1202 07:45:56.609433 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b"}
Dec 02 07:45:56 crc kubenswrapper[4691]: I1202 07:45:56.609441 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5"}
Dec 02 07:45:56 crc kubenswrapper[4691]: I1202 07:45:56.609487 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:56 crc kubenswrapper[4691]: I1202 07:45:56.609978 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:56 crc kubenswrapper[4691]: I1202 07:45:56.610025 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:56 crc kubenswrapper[4691]: I1202 07:45:56.610035 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:56 crc kubenswrapper[4691]: I1202 07:45:56.610584 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:56 crc kubenswrapper[4691]: I1202 07:45:56.610596 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:56 crc kubenswrapper[4691]: I1202 07:45:56.610604 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:57 crc kubenswrapper[4691]: I1202 07:45:57.526946 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 07:45:57 crc kubenswrapper[4691]: I1202 07:45:57.617554 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819"}
Dec 02 07:45:57 crc kubenswrapper[4691]: I1202 07:45:57.617652 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e"}
Dec 02 07:45:57 crc kubenswrapper[4691]: I1202 07:45:57.617597 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:57 crc kubenswrapper[4691]: I1202 07:45:57.617695 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:57 crc kubenswrapper[4691]: I1202 07:45:57.619589 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:57 crc kubenswrapper[4691]: I1202 07:45:57.619632 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:57 crc kubenswrapper[4691]: I1202 07:45:57.619647 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:57 crc kubenswrapper[4691]: I1202 07:45:57.620705 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:57 crc kubenswrapper[4691]: I1202 07:45:57.620741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:57 crc kubenswrapper[4691]: I1202 07:45:57.620754 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.353225 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.353985 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.356145 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.356199 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.356220 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.534089 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.619791 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.619865 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.620738 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.620788 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.620755 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.620867 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.620799 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.621140 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.638977 4691 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.639070 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.892541 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.892696 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.894093 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.894179 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:58 crc kubenswrapper[4691]: I1202 07:45:58.894196 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:59 crc kubenswrapper[4691]: I1202 07:45:59.445362 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 07:45:59 crc kubenswrapper[4691]: I1202 07:45:59.453971 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 07:45:59 crc kubenswrapper[4691]: I1202 07:45:59.621854 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:59 crc kubenswrapper[4691]: I1202 07:45:59.621867 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:45:59 crc kubenswrapper[4691]: I1202 07:45:59.623408 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:59 crc kubenswrapper[4691]: I1202 07:45:59.623459 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:59 crc kubenswrapper[4691]: I1202 07:45:59.623471 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:45:59 crc kubenswrapper[4691]: I1202 07:45:59.623849 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:45:59 crc kubenswrapper[4691]: I1202 07:45:59.623992 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:45:59 crc kubenswrapper[4691]: I1202 07:45:59.624114 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:00 crc kubenswrapper[4691]: I1202 07:46:00.624256 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:46:00 crc kubenswrapper[4691]: I1202 07:46:00.625312 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:00 crc kubenswrapper[4691]: I1202 07:46:00.625340 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:00 crc kubenswrapper[4691]: I1202 07:46:00.625349 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:01 crc kubenswrapper[4691]: I1202 07:46:01.891660 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Dec 02 07:46:01 crc kubenswrapper[4691]: I1202 07:46:01.891882 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:46:01 crc kubenswrapper[4691]: I1202 07:46:01.893400 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:01 crc kubenswrapper[4691]: I1202 07:46:01.893491 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:01 crc kubenswrapper[4691]: I1202 07:46:01.893525 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:02 crc kubenswrapper[4691]: I1202 07:46:02.204401 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Dec 02 07:46:02 crc kubenswrapper[4691]: I1202 07:46:02.633492 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:46:02 crc kubenswrapper[4691]: I1202 07:46:02.634600 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:02 crc kubenswrapper[4691]: I1202 07:46:02.634633 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:02 crc kubenswrapper[4691]: I1202 07:46:02.634648 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:02 crc kubenswrapper[4691]: E1202 07:46:02.639409 4691 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 02 07:46:04 crc kubenswrapper[4691]: W1202 07:46:04.059286 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 02 07:46:04 crc kubenswrapper[4691]: I1202 07:46:04.059397 4691 trace.go:236] Trace[2134008521]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 07:45:54.057) (total time: 10001ms):
Dec 02 07:46:04 crc kubenswrapper[4691]: Trace[2134008521]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (07:46:04.059)
Dec 02 07:46:04 crc kubenswrapper[4691]: Trace[2134008521]: [10.001738446s] [10.001738446s] END
Dec 02 07:46:04 crc kubenswrapper[4691]: E1202 07:46:04.059424 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 02 07:46:04 crc kubenswrapper[4691]: E1202 07:46:04.138373 4691 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc"
Dec 02 07:46:04 crc kubenswrapper[4691]: I1202 07:46:04.454121 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 02 07:46:04 crc kubenswrapper[4691]: I1202 07:46:04.454327 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:46:04 crc kubenswrapper[4691]: I1202 07:46:04.455491 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:04 crc kubenswrapper[4691]: I1202 07:46:04.455532 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:04 crc kubenswrapper[4691]: I1202 07:46:04.455544 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:04 crc kubenswrapper[4691]: I1202 07:46:04.490656 4691 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Dec 02 07:46:04 crc kubenswrapper[4691]: I1202 07:46:04.969886 4691 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 02 07:46:04 crc kubenswrapper[4691]: I1202 07:46:04.969951 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 02 07:46:04 crc kubenswrapper[4691]: I1202 07:46:04.983349 4691 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 02 07:46:04 crc kubenswrapper[4691]: I1202 07:46:04.983413 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 02 07:46:05 crc kubenswrapper[4691]: I1202 07:46:05.506849 4691 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 02 07:46:05 crc kubenswrapper[4691]: I1202 07:46:05.507360 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 02 07:46:05 crc kubenswrapper[4691]: I1202 07:46:05.739218 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:46:05 crc kubenswrapper[4691]: I1202 07:46:05.740566 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:05 crc kubenswrapper[4691]: I1202 07:46:05.740602 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:05 crc kubenswrapper[4691]: I1202 07:46:05.740614 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:05 crc kubenswrapper[4691]: I1202 07:46:05.740641 4691 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 02 07:46:06 crc kubenswrapper[4691]: I1202 07:46:06.476309 4691 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.362369 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.362548 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.363720 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.363909 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.364045 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.537140 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.537308 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.537617 4691 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.537673 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.538647 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.538681 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.538695 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.543841 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.639041 4691 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.639111 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.648183 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.648623 4691 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.648684 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.649169 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.649204 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:08 crc kubenswrapper[4691]: I1202 07:46:08.649215 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:09 crc kubenswrapper[4691]: E1202 07:46:09.972941 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="3.2s"
Dec 02 07:46:09 crc kubenswrapper[4691]: I1202 07:46:09.974220 4691 trace.go:236] Trace[1689259404]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 07:45:55.796) (total time: 14177ms):
Dec 02 07:46:09 crc kubenswrapper[4691]: Trace[1689259404]: ---"Objects listed" error: 14177ms (07:46:09.974)
Dec 02 07:46:09 crc kubenswrapper[4691]: Trace[1689259404]: [14.177888606s] [14.177888606s] END
Dec 02 07:46:09 crc kubenswrapper[4691]: I1202 07:46:09.974249 4691 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 02 07:46:09 crc kubenswrapper[4691]: I1202 07:46:09.976102 4691 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 02 07:46:09 crc kubenswrapper[4691]: I1202 07:46:09.976154 4691 trace.go:236] Trace[943190861]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 07:45:56.159) (total time: 13816ms):
Dec 02 07:46:09 crc kubenswrapper[4691]: Trace[943190861]: ---"Objects listed" error: 13816ms (07:46:09.975)
Dec 02 07:46:09 crc kubenswrapper[4691]: Trace[943190861]: [13.81693686s] [13.81693686s] END
Dec 02 07:46:09 crc kubenswrapper[4691]: I1202 07:46:09.976174 4691 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 02 07:46:09 crc kubenswrapper[4691]: I1202 07:46:09.976266 4691 trace.go:236] Trace[102347729]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (02-Dec-2025 07:45:56.104) (total time: 13871ms):
Dec 02 07:46:09 crc kubenswrapper[4691]: Trace[102347729]: ---"Objects listed" error: 13871ms (07:46:09.976)
Dec 02 07:46:09 crc kubenswrapper[4691]: Trace[102347729]: [13.87176522s] [13.87176522s] END
Dec 02 07:46:09 crc kubenswrapper[4691]: I1202 07:46:09.976289 4691 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.485499 4691 apiserver.go:52] "Watching apiserver"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.487915 4691 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.488156 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-v26sg"]
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.488615 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.488648 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.488712 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.488772 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.488864 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.488901 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v26sg"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.488913 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.489192 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.489194 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.489309 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.490424 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.491422 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.491583 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.493284 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.493668 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.494363 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.494478 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.494648 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.494848 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.494933 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.494654 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.494682 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.495437 4691 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.507053 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.520258 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.528225 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.542911 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.557426 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.570715 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578658 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578717 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578747 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578781 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578801 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578816 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578839 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578855 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578870 4691 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578885 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578902 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578917 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578933 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578950 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578969 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.578983 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579000 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579016 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 
07:46:10.579031 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579046 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579063 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579078 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579092 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579107 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579122 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579083 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579137 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579155 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579171 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579187 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579203 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579218 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579249 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579266 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579283 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579297 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" 
(UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579333 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579351 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579365 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579380 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579395 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579410 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579429 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579445 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579461 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579479 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579495 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579510 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579525 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579540 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579558 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579572 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579586 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579603 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579616 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579674 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579706 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579722 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579738 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579752 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579787 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579804 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579819 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579836 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579852 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579867 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579884 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579901 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579918 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579934 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579968 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579986 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580002 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580017 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580031 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580046 4691 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580063 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580111 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580128 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580144 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580160 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580178 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580195 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580210 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580226 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: 
\"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580252 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580284 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580302 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580317 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580334 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580357 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580380 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580404 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580430 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580451 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: 
\"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580637 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580664 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580696 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580726 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580749 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580799 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580814 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580831 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580849 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580879 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: 
\"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580916 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580937 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580989 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581011 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581033 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581053 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581080 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581103 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581126 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581153 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581178 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581200 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581223 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581247 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581271 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581490 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581519 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581546 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581571 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581597 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581625 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581646 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581671 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581694 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581713 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581728 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581744 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581780 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581797 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581814 4691 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581830 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581845 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581860 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581878 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581902 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581924 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581945 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581962 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581980 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 
07:46:10.581999 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582016 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582031 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582047 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582062 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582078 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582097 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582114 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582131 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582147 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") 
" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582164 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582182 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582198 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582218 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582235 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582252 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582272 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582289 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582306 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582325 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582342 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582358 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582376 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582391 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582408 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582434 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582452 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582468 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582483 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582502 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 02 07:46:10 
crc kubenswrapper[4691]: I1202 07:46:10.582519 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582539 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582558 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582577 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582595 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582614 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582630 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582647 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582665 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582682 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582700 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582717 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582735 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582752 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583379 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583403 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583422 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583440 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583457 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583474 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583490 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583506 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583521 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583561 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583599 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583621 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583640 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583659 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwn7b\" (UniqueName: \"kubernetes.io/projected/be5dfd05-e2d4-460a-93e2-5e138f0dc58c-kube-api-access-hwn7b\") pod \"node-resolver-v26sg\" (UID: \"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\") " pod="openshift-dns/node-resolver-v26sg" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583679 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hosts-file\" (UniqueName: \"kubernetes.io/host-path/be5dfd05-e2d4-460a-93e2-5e138f0dc58c-hosts-file\") pod \"node-resolver-v26sg\" (UID: \"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\") " pod="openshift-dns/node-resolver-v26sg" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583700 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583718 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583737 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583760 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583795 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583814 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583831 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583848 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 
07:46:10.583867 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583887 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583931 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579136 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579644 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579639 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579811 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579867 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.579881 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580145 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580377 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580368 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580409 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580482 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.590143 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580629 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580655 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580724 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580898 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.580942 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581627 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.581851 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582058 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582076 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). 
InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582073 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582090 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582299 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582301 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582511 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582521 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582541 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582585 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582902 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.582901 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583030 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.583077 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.584314 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.584797 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.586408 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.586448 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.586509 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.586744 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.586901 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.587156 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.587100 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.587333 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.587405 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.587632 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). 
InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.587594 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.587864 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.588006 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.588030 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.588087 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.588170 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.588329 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.588677 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.589044 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.589202 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.589426 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.589592 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.590492 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.589690 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.589776 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.589968 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.590706 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.590987 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.591440 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.591529 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.591833 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.592197 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.592406 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.593537 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.593872 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.593908 4691 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.594385 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.594633 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.594888 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.595143 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.595145 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.590287 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.595290 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.595410 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.595593 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.595596 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.595644 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.595703 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.597032 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.597120 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.597329 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.594262 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.597616 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.597984 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.598004 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.598249 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.598307 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.598340 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6gcsh"] Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.598536 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.598620 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.598819 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-rs5s2"] Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.598881 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.599024 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.599204 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.599335 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.599508 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.600147 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.600399 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.600449 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.600644 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.600660 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.601173 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 07:46:11.101144524 +0000 UTC m=+18.885223386 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.601362 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.601893 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.601998 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.602083 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.602170 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.602177 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.602307 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-mgbt6"] Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.590071 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). 
InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.602798 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.602956 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.602996 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.602993 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.603547 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.605865 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.606678 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.606720 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.606755 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.606864 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.606782 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.607060 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.607095 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.607360 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.607828 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.608001 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.608069 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.608220 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.608308 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.608442 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.608524 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.608579 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.608851 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.609281 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:11.107593372 +0000 UTC m=+18.891672234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.609407 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:11.109389489 +0000 UTC m=+18.893468351 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.609447 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.609704 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.609748 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.609940 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.610010 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.609971 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.610248 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.610310 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.610652 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.611033 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.611285 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.611701 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.611850 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.611901 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.612291 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.612694 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.611733 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.613029 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.613385 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.613601 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.613643 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.613716 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.613920 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.613966 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.614729 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.614802 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.612165 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.615001 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.615054 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.615121 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.615167 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.615435 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.615506 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.615535 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.615787 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.615932 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.615987 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.615649 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.616100 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.616098 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.616291 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.616308 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.616428 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.616564 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.616593 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.616608 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.616663 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.616709 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:11.116694889 +0000 UTC m=+18.900773751 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.616827 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.616880 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.616568 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.617156 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.617359 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.618239 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.618783 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.618934 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.619030 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.619044 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.619075 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.619091 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.619099 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.619173 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:11.119141753 +0000 UTC m=+18.903220615 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.619957 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.620013 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.620211 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.620263 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.625931 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.626427 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.626454 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.626522 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.626935 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.627150 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.627583 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.627945 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.628495 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.628564 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.630347 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.630555 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.630569 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.631057 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.632520 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.633991 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.635160 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.635353 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.635500 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.638313 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.638951 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.639344 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.639514 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.639524 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.640186 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.640477 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.648232 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.668688 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.671422 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.674135 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.674376 4691 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df" exitCode=255 Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.674417 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df"} Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.676226 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.681439 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.682231 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685006 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-multus-socket-dir-parent\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685043 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-var-lib-cni-multus\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685062 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685080 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwn7b\" (UniqueName: \"kubernetes.io/projected/be5dfd05-e2d4-460a-93e2-5e138f0dc58c-kube-api-access-hwn7b\") pod \"node-resolver-v26sg\" (UID: \"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\") " pod="openshift-dns/node-resolver-v26sg" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685095 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ce5053b-1d3d-4bc9-9b65-a38112c18218-cni-binary-copy\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685111 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-run-k8s-cni-cncf-io\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685131 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-cni-binary-copy\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685167 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-run-netns\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685183 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-hostroot\") pod \"multus-6gcsh\" 
(UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685196 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ce5053b-1d3d-4bc9-9b65-a38112c18218-system-cni-dir\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685215 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ce5053b-1d3d-4bc9-9b65-a38112c18218-os-release\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685228 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-var-lib-kubelet\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685244 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685264 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-multus-conf-dir\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685277 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-run-multus-certs\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685295 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/be5dfd05-e2d4-460a-93e2-5e138f0dc58c-hosts-file\") pod \"node-resolver-v26sg\" (UID: \"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\") " pod="openshift-dns/node-resolver-v26sg" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685324 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drlmn\" (UniqueName: \"kubernetes.io/projected/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-kube-api-access-drlmn\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685345 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-os-release\") pod \"multus-6gcsh\" 
(UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685364 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-var-lib-cni-bin\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685378 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ce5053b-1d3d-4bc9-9b65-a38112c18218-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685391 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ce5053b-1d3d-4bc9-9b65-a38112c18218-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685405 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8c92\" (UniqueName: \"kubernetes.io/projected/5ce5053b-1d3d-4bc9-9b65-a38112c18218-kube-api-access-x8c92\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685422 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-system-cni-dir\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685439 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-multus-cni-dir\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685455 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-cnibin\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685487 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ce5053b-1d3d-4bc9-9b65-a38112c18218-cnibin\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685502 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-multus-daemon-config\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685519 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-etc-kubernetes\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685582 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685591 4691 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685600 4691 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685609 4691 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685618 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685626 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685635 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685644 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685652 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685660 4691 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685668 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath 
\"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685677 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685684 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685693 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685701 4691 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685710 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.685718 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.686034 4691 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.686051 4691 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.686067 4691 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.686078 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.686075 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.686089 4691 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.686258 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.686581 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.686689 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/be5dfd05-e2d4-460a-93e2-5e138f0dc58c-hosts-file\") pod \"node-resolver-v26sg\" (UID: \"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\") " pod="openshift-dns/node-resolver-v26sg" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.686212 4691 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.687555 4691 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.687596 4691 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.687611 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.687623 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.687634 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.687645 4691 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.687658 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.687669 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.687680 4691 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.687692 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.687703 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.687716 4691 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688279 4691 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688302 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688316 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688330 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688342 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688390 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688405 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688417 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" 
(UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688462 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688474 4691 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688486 4691 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688523 4691 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688535 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688546 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688645 4691 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688657 4691 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688694 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688719 4691 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688731 4691 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688741 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688752 4691 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.688988 4691 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689000 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689011 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689027 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689039 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689051 4691 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689064 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689075 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689087 4691 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689099 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689111 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689145 4691 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689157 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689168 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689180 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689192 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689203 4691 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689216 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689228 4691 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689240 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689250 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689261 4691 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689252 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689273 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689382 4691 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689395 4691 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689406 4691 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689418 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689429 4691 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689440 4691 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689451 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689464 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689474 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689485 4691 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689497 4691 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689515 4691 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689527 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689539 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689551 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689563 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689574 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689587 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689599 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689611 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689623 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689638 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689650 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689661 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689672 4691 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689684 4691 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689696 4691 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689709 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689720 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689731 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689743 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689774 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689787 4691 reconciler_common.go:293] "Volume detached for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689798 4691 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689810 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689822 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689836 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689848 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689859 4691 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689871 4691 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689883 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689893 4691 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689905 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689916 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689927 4691 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689939 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689951 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689962 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689974 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.689986 4691 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690008 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690019 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690030 4691 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690040 4691 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690052 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690064 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690076 4691 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690088 4691 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690100 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690111 4691 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690122 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690133 4691 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690145 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690157 4691 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690168 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690179 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690190 4691 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690202 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690215 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690226 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690238 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690248 4691 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" 
Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690259 4691 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690270 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690282 4691 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690295 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690306 4691 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690317 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690329 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690340 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690351 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690362 4691 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690374 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690384 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690395 4691 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath 
\"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690406 4691 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690418 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690429 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690442 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690455 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690467 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690478 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690490 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690501 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690512 4691 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690524 4691 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690535 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690563 4691 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on 
node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690575 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690587 4691 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690599 4691 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690610 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690621 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690632 4691 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690644 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690656 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690670 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690682 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690694 4691 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690708 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690721 4691 reconciler_common.go:293] "Volume detached for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690748 4691 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.690778 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.696930 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.700708 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwn7b\" (UniqueName: \"kubernetes.io/projected/be5dfd05-e2d4-460a-93e2-5e138f0dc58c-kube-api-access-hwn7b\") pod \"node-resolver-v26sg\" (UID: \"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\") " pod="openshift-dns/node-resolver-v26sg" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.705812 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.715033 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.723901 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.732507 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.739707 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.747889 4691 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.748142 4691 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.748840 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.749346 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.749474 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.749675 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:10 crc 
kubenswrapper[4691]: I1202 07:46:10.749750 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.749828 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:10Z","lastTransitionTime":"2025-12-02T07:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.766384 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.768234 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.771633 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.771669 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.771680 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.771697 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.771707 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:10Z","lastTransitionTime":"2025-12-02T07:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.773596 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.773630 4691 scope.go:117] "RemoveContainer" containerID="ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df" Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.783805 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.791190 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.791450 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ce5053b-1d3d-4bc9-9b65-a38112c18218-system-cni-dir\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.791508 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ce5053b-1d3d-4bc9-9b65-a38112c18218-os-release\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.791540 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-var-lib-kubelet\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.792236 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82103e10-1127-4a84-b5fc-9d0d6a259932-proxy-tls\") pod \"machine-config-daemon-mgbt6\" (UID: \"82103e10-1127-4a84-b5fc-9d0d6a259932\") " pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.792378 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-multus-conf-dir\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.792408 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-run-multus-certs\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.792487 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drlmn\" (UniqueName: \"kubernetes.io/projected/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-kube-api-access-drlmn\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.792604 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-os-release\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.792692 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5ce5053b-1d3d-4bc9-9b65-a38112c18218-os-release\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.792738 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-var-lib-kubelet\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.792756 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5ce5053b-1d3d-4bc9-9b65-a38112c18218-system-cni-dir\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.792839 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-multus-conf-dir\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.792881 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-run-multus-certs\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.792523 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-os-release\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793229 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-var-lib-cni-bin\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793292 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ce5053b-1d3d-4bc9-9b65-a38112c18218-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793322 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ce5053b-1d3d-4bc9-9b65-a38112c18218-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793366 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8c92\" (UniqueName: \"kubernetes.io/projected/5ce5053b-1d3d-4bc9-9b65-a38112c18218-kube-api-access-x8c92\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793393 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-system-cni-dir\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793449 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-multus-cni-dir\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793504 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-cnibin\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793536 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-multus-daemon-config\") pod \"multus-6gcsh\" 
(UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793639 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ce5053b-1d3d-4bc9-9b65-a38112c18218-cnibin\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793678 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-etc-kubernetes\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793751 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-multus-socket-dir-parent\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793804 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-var-lib-cni-multus\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793835 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/82103e10-1127-4a84-b5fc-9d0d6a259932-rootfs\") pod \"machine-config-daemon-mgbt6\" (UID: \"82103e10-1127-4a84-b5fc-9d0d6a259932\") " pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793860 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ce5053b-1d3d-4bc9-9b65-a38112c18218-cni-binary-copy\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793890 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-run-k8s-cni-cncf-io\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793921 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psldk\" (UniqueName: \"kubernetes.io/projected/82103e10-1127-4a84-b5fc-9d0d6a259932-kube-api-access-psldk\") pod \"machine-config-daemon-mgbt6\" (UID: \"82103e10-1127-4a84-b5fc-9d0d6a259932\") " pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.793970 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-cni-binary-copy\") pod \"multus-6gcsh\" (UID: 
\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.794019 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-run-netns\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.794066 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-hostroot\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.794098 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/82103e10-1127-4a84-b5fc-9d0d6a259932-mcd-auth-proxy-config\") pod \"machine-config-daemon-mgbt6\" (UID: \"82103e10-1127-4a84-b5fc-9d0d6a259932\") " pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.794141 4691 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.794204 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-var-lib-cni-bin\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.794570 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5ce5053b-1d3d-4bc9-9b65-a38112c18218-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.795327 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-run-k8s-cni-cncf-io\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.795435 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-var-lib-cni-multus\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.794804 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-multus-socket-dir-parent\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796214 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5ce5053b-1d3d-4bc9-9b65-a38112c18218-cni-binary-copy\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796290 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-host-run-netns\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796360 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-cnibin\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796460 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5ce5053b-1d3d-4bc9-9b65-a38112c18218-cnibin\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796507 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796538 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796549 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796550 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-cni-binary-copy\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796588 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796603 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:10Z","lastTransitionTime":"2025-12-02T07:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796629 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-hostroot\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796645 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-etc-kubernetes\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796654 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5ce5053b-1d3d-4bc9-9b65-a38112c18218-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.796705 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-system-cni-dir\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.797384 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-multus-cni-dir\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.797731 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-multus-daemon-config\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.802413 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.810266 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.810588 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.813438 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.816111 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.816140 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.816150 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.816166 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.816177 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:10Z","lastTransitionTime":"2025-12-02T07:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.823062 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8c92\" (UniqueName: \"kubernetes.io/projected/5ce5053b-1d3d-4bc9-9b65-a38112c18218-kube-api-access-x8c92\") pod \"multus-additional-cni-plugins-rs5s2\" (UID: \"5ce5053b-1d3d-4bc9-9b65-a38112c18218\") " pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.823161 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.827064 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.827122 4691 
kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/red
hat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987
117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba
717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.827275 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drlmn\" (UniqueName: \"kubernetes.io/projected/eb6171dd-c2ea-4c52-b906-e8a9a7ff6537-kube-api-access-drlmn\") pod \"multus-6gcsh\" (UID: \"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\") " pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.829477 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v26sg" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.835002 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.835036 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.835044 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.835060 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.835069 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:10Z","lastTransitionTime":"2025-12-02T07:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.847561 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: E1202 07:46:10.847797 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.848038 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.851370 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.851396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.851404 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.851419 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.851429 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:10Z","lastTransitionTime":"2025-12-02T07:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:10 crc kubenswrapper[4691]: W1202 07:46:10.858712 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5dfd05_e2d4_460a_93e2_5e138f0dc58c.slice/crio-ef6c7c185655e95a0e3752541d47ef2be3de9c10bc39ea5968a51164348c34e3 WatchSource:0}: Error finding container ef6c7c185655e95a0e3752541d47ef2be3de9c10bc39ea5968a51164348c34e3: Status 404 returned error can't find the container with id ef6c7c185655e95a0e3752541d47ef2be3de9c10bc39ea5968a51164348c34e3 Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.894681 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/82103e10-1127-4a84-b5fc-9d0d6a259932-rootfs\") pod \"machine-config-daemon-mgbt6\" (UID: \"82103e10-1127-4a84-b5fc-9d0d6a259932\") " pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.894993 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psldk\" (UniqueName: \"kubernetes.io/projected/82103e10-1127-4a84-b5fc-9d0d6a259932-kube-api-access-psldk\") pod \"machine-config-daemon-mgbt6\" (UID: \"82103e10-1127-4a84-b5fc-9d0d6a259932\") " pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.895018 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/82103e10-1127-4a84-b5fc-9d0d6a259932-mcd-auth-proxy-config\") pod \"machine-config-daemon-mgbt6\" (UID: \"82103e10-1127-4a84-b5fc-9d0d6a259932\") " pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.895041 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82103e10-1127-4a84-b5fc-9d0d6a259932-proxy-tls\") pod \"machine-config-daemon-mgbt6\" (UID: \"82103e10-1127-4a84-b5fc-9d0d6a259932\") " pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.894870 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/82103e10-1127-4a84-b5fc-9d0d6a259932-rootfs\") pod \"machine-config-daemon-mgbt6\" (UID: \"82103e10-1127-4a84-b5fc-9d0d6a259932\") " pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.895838 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/82103e10-1127-4a84-b5fc-9d0d6a259932-mcd-auth-proxy-config\") pod \"machine-config-daemon-mgbt6\" (UID: \"82103e10-1127-4a84-b5fc-9d0d6a259932\") " pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.901002 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/82103e10-1127-4a84-b5fc-9d0d6a259932-proxy-tls\") pod \"machine-config-daemon-mgbt6\" (UID: \"82103e10-1127-4a84-b5fc-9d0d6a259932\") " pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.914491 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-psldk\" (UniqueName: \"kubernetes.io/projected/82103e10-1127-4a84-b5fc-9d0d6a259932-kube-api-access-psldk\") pod \"machine-config-daemon-mgbt6\" (UID: \"82103e10-1127-4a84-b5fc-9d0d6a259932\") " pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.952044 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6gcsh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.957187 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.957225 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.957235 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.957251 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.957263 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:10Z","lastTransitionTime":"2025-12-02T07:46:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.962432 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.972714 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pgxh"] Dec 02 07:46:10 crc kubenswrapper[4691]: W1202 07:46:10.972781 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb6171dd_c2ea_4c52_b906_e8a9a7ff6537.slice/crio-935050c0b4c141bc59ace1029ec153878d4b674c1f422045b02d7969c192ffcb WatchSource:0}: Error finding container 935050c0b4c141bc59ace1029ec153878d4b674c1f422045b02d7969c192ffcb: Status 404 returned error can't find the container with id 935050c0b4c141bc59ace1029ec153878d4b674c1f422045b02d7969c192ffcb Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.973489 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.978308 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.986170 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.986498 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.986817 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.986946 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.987099 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.987271 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 07:46:10 crc kubenswrapper[4691]: I1202 07:46:10.989588 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.022072 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.054287 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.071029 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.071222 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.071305 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.071383 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.071469 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:11Z","lastTransitionTime":"2025-12-02T07:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.074872 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.101061 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.101351 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-var-lib-openvswitch\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.101462 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-node-log\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.101557 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-ovnkube-config\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.101647 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3605748c-8980-4aa9-8d28-f18a17aa8124-ovn-node-metrics-cert\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.101739 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-cni-bin\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.101842 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-log-socket\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.101922 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-ovnkube-script-lib\") pod 
\"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102037 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-systemd\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102131 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-openvswitch\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102214 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-kubelet\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102300 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8q22\" (UniqueName: \"kubernetes.io/projected/3605748c-8980-4aa9-8d28-f18a17aa8124-kube-api-access-s8q22\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102383 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-systemd-units\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102469 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-slash\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102558 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-ovn\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102639 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102749 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-env-overrides\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102798 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-cni-netd\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102817 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-run-ovn-kubernetes\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102858 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-run-netns\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.102874 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-etc-openvswitch\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.128800 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.147045 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.160940 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.178456 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.191200 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 
02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.196011 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.196058 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.196071 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.196088 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.196100 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:11Z","lastTransitionTime":"2025-12-02T07:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203513 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203574 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-run-ovn-kubernetes\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203595 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203613 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203628 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-run-netns\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203641 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-etc-openvswitch\") pod \"ovnkube-node-7pgxh\" (UID: 
\"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203659 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-var-lib-openvswitch\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203673 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-node-log\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203687 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-ovnkube-config\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203699 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3605748c-8980-4aa9-8d28-f18a17aa8124-ovn-node-metrics-cert\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203716 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203731 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-cni-bin\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203746 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-log-socket\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203772 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-ovnkube-script-lib\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203789 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-systemd\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203804 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-openvswitch\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203818 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203833 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8q22\" (UniqueName: \"kubernetes.io/projected/3605748c-8980-4aa9-8d28-f18a17aa8124-kube-api-access-s8q22\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203846 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-kubelet\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203860 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-systemd-units\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203872 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-slash\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203891 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-ovn\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203904 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203918 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-env-overrides\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203932 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-cni-netd\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.203981 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-cni-netd\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.204039 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:46:12.20402565 +0000 UTC m=+19.988104512 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.204059 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-run-ovn-kubernetes\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.204094 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.204116 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:12.204111092 +0000 UTC m=+19.988189954 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.204165 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.204184 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.204202 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-run-netns\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.204221 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-etc-openvswitch\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.204240 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-var-lib-openvswitch\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.204259 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-node-log\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.204393 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-slash\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.204726 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-kubelet\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.204747 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-ovnkube-config\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.204787 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-systemd-units\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.204799 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-ovn\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.204827 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-log-socket\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.204856 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.205210 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-env-overrides\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.205245 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-systemd\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.205621 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-openvswitch\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.205656 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-ovnkube-script-lib\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.205675 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.205689 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-cni-bin\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.205696 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.205707 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.205739 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:12.205728554 +0000 UTC m=+19.989807476 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.205977 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.209004 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3605748c-8980-4aa9-8d28-f18a17aa8124-ovn-node-metrics-cert\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.216388 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.217950 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.217994 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.218008 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 07:46:11 crc kubenswrapper[4691]: E1202 07:46:11.218102 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:12.218083836 +0000 UTC m=+20.002162698 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.223177 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8q22\" (UniqueName: \"kubernetes.io/projected/3605748c-8980-4aa9-8d28-f18a17aa8124-kube-api-access-s8q22\") pod \"ovnkube-node-7pgxh\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.239839 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.299492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.299527 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.299538 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
event="NodeHasSufficientPID" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.299553 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.299564 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:11Z","lastTransitionTime":"2025-12-02T07:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.324105 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:11 crc kubenswrapper[4691]: W1202 07:46:11.338082 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3605748c_8980_4aa9_8d28_f18a17aa8124.slice/crio-5d3d1eb7e31f60d5fe2f562867d91bc5039d6333763d3387e87aafd096cc1ede WatchSource:0}: Error finding container 5d3d1eb7e31f60d5fe2f562867d91bc5039d6333763d3387e87aafd096cc1ede: Status 404 returned error can't find the container with id 5d3d1eb7e31f60d5fe2f562867d91bc5039d6333763d3387e87aafd096cc1ede Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.400011 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.402019 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.402063 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.402075 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.402091 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.402856 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:11Z","lastTransitionTime":"2025-12-02T07:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.505037 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.505074 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.505087 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.505104 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.505117 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:11Z","lastTransitionTime":"2025-12-02T07:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.607747 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.607786 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.607794 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.607811 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.607820 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:11Z","lastTransitionTime":"2025-12-02T07:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.683164 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.683245 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.683255 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e899becba35325e76da9dceaca41091d884da49412a6247620e908facc6b7d66"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.684922 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.684946 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.684956 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"c0e374b359a08cc0272bd467de857c7c67538aad4e81d00f69d21a73858c795d"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.688178 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.690108 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.690684 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.692049 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v26sg" event={"ID":"be5dfd05-e2d4-460a-93e2-5e138f0dc58c","Type":"ContainerStarted","Data":"137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.692081 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v26sg" event={"ID":"be5dfd05-e2d4-460a-93e2-5e138f0dc58c","Type":"ContainerStarted","Data":"ef6c7c185655e95a0e3752541d47ef2be3de9c10bc39ea5968a51164348c34e3"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.695711 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-6gcsh" event={"ID":"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537","Type":"ContainerStarted","Data":"73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.695737 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gcsh" event={"ID":"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537","Type":"ContainerStarted","Data":"935050c0b4c141bc59ace1029ec153878d4b674c1f422045b02d7969c192ffcb"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.696823 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"00fb20a66627f4bf44a6b5d8f125abd1732ab32ef7bc7609af32c25a7edbfe69"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.697239 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.698357 4691 generic.go:334] "Generic 
(PLEG): container finished" podID="5ce5053b-1d3d-4bc9-9b65-a38112c18218" containerID="f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4" exitCode=0 Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.698403 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" event={"ID":"5ce5053b-1d3d-4bc9-9b65-a38112c18218","Type":"ContainerDied","Data":"f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.698419 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" event={"ID":"5ce5053b-1d3d-4bc9-9b65-a38112c18218","Type":"ContainerStarted","Data":"d05d759afacbd1bf6a9807b6f3ae2fc3a5d7a944b40facdf83b2c1c1ca8793fb"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.699687 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.699706 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"44a2cf25b9e2da228d27365b62d0ac9a2cf58e94299bcce827eac86f913aa4df"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.704163 4691 generic.go:334] "Generic (PLEG): container finished" podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c" exitCode=0 Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.704193 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.704211 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"5d3d1eb7e31f60d5fe2f562867d91bc5039d6333763d3387e87aafd096cc1ede"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.709187 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.709214 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.709223 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.709236 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.709246 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:11Z","lastTransitionTime":"2025-12-02T07:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.714443 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: 
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.731097 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.742505 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z"
Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.761251 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.776076 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.791577 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.806558 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.832712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.832787 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.832800 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.832817 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.832827 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:11Z","lastTransitionTime":"2025-12-02T07:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.840916 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.854697 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.870041 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.886124 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.899498 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.909996 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.924109 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.933732 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.935158 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.935200 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.935211 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.935227 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.935237 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:11Z","lastTransitionTime":"2025-12-02T07:46:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.946827 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.961742 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z 
is after 2025-08-24T17:21:41Z" Dec 02 07:46:11 crc kubenswrapper[4691]: I1202 07:46:11.973636 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.006083 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:11Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.037678 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.037717 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.037727 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.037741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.037752 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:12Z","lastTransitionTime":"2025-12-02T07:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.037995 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.063837 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.095562 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.114530 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc 
kubenswrapper[4691]: I1202 07:46:12.141151 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.141190 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.141203 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.141218 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.141226 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:12Z","lastTransitionTime":"2025-12-02T07:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.211841 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.211987 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:46:14.211959562 +0000 UTC m=+21.996038424 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.212315 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.212331 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.212390 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.212398 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:14.212384673 +0000 UTC m=+21.996463535 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.212469 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.212544 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.212621 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:14.212602199 +0000 UTC m=+21.996681061 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.212619 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.212649 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.212660 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.212789 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:14.212684281 +0000 UTC m=+21.996763143 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.235181 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.260046 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.260085 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.264446 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.264477 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.264489 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.264507 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.264519 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:12Z","lastTransitionTime":"2025-12-02T07:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.265238 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.286647 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\"
:\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountP
ath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.300635 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z"
Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.314105 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.314302 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
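The entries above capture this section's root fault: every pod status patch from the kubelet is rejected because the "pod.network-node-identity.openshift.io" webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, over three months before the node's current clock time of 2025-12-02T07:46:12Z. The projected-volume errors directly above are a separate, transient startup symptom: the per-namespace kube-root-ca.crt and openshift-service-ca.crt configmaps are not yet registered in the kubelet's object cache. A minimal sketch for confirming the certificate window from the node follows; it assumes Python 3 with the third-party cryptography package installed, and takes the host and port from the failed Post URL in the log:

    # Fetch the webhook's serving certificate and print its validity window.
    # ssl.get_server_certificate() performs no chain verification, so it
    # returns the PEM even though the certificate is already expired.
    import ssl
    from cryptography import x509

    HOST, PORT = "127.0.0.1", 9743  # endpoint from the failed Post above

    pem = ssl.get_server_certificate((HOST, PORT))
    cert = x509.load_pem_x509_certificate(pem.encode())

    print("subject:  ", cert.subject.rfc4514_string())
    print("notBefore:", cert.not_valid_before)
    print("notAfter: ", cert.not_valid_after)  # expect 2025-08-24 17:21:41 per the log

If the printed notAfter matches the 2025-08-24T17:21:41Z date in the errors, the webhook is serving a stale certificate, and status patches cannot succeed until it is rotated.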
Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.314350 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.314364 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.314441 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:14.314423365 +0000 UTC m=+22.098502217 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.317802 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.335860 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.349851 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.364061 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.367405 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.367441 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.367451 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.367465 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.367475 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:12Z","lastTransitionTime":"2025-12-02T07:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.376973 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.376973 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z"
Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.400575 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z 
is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.416063 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.428888 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.443027 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.456109 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.469712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.469775 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.469789 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.469808 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.469821 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:12Z","lastTransitionTime":"2025-12-02T07:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.499562 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z 
is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.538991 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.560522 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.560646 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.560739 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.560802 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.560884 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:12 crc kubenswrapper[4691]: E1202 07:46:12.561088 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.564939 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.566024 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.567282 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.568256 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.571701 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.572479 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.572513 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.572523 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.572538 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.572547 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:12Z","lastTransitionTime":"2025-12-02T07:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.587904 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.614647 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.635438 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.636426 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.655076 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.675080 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.675120 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.675132 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.675146 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.675157 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:12Z","lastTransitionTime":"2025-12-02T07:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.676523 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.696558 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.701163 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.712879 4691 generic.go:334] "Generic (PLEG): container finished" podID="5ce5053b-1d3d-4bc9-9b65-a38112c18218" containerID="dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73" exitCode=0 Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.724336 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.725195 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.726499 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.727239 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.727895 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.728890 4691 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.729828 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.730854 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.731313 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.732361 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.733130 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.734223 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.734792 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.735207 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.736244 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.736285 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.736686 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.737668 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.744100 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.744569 4691 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.745611 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.746166 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.746632 4691 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.747162 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.748936 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.749453 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.750301 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.751949 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.752630 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.753536 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.754232 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.755276 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.755792 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.756720 4691 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.757474 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.758656 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.759136 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.760132 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.760780 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.761969 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.762595 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.763632 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.764241 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.765381 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.766137 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.766667 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.767502 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" event={"ID":"5ce5053b-1d3d-4bc9-9b65-a38112c18218","Type":"ContainerDied","Data":"dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 
07:46:12.767544 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.767558 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.767567 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.767577 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.775009 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.777429 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.777490 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.777505 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.777528 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.777541 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:12Z","lastTransitionTime":"2025-12-02T07:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.814309 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.824262 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-rg26n"] Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.824601 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-rg26n" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.847944 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.867540 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.880212 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.880247 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.880257 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.880272 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.880281 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:12Z","lastTransitionTime":"2025-12-02T07:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.887514 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.908424 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.921998 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0707e5b7-0f43-45e6-a9c9-af60cbbe31de-host\") pod \"node-ca-rg26n\" (UID: \"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\") " pod="openshift-image-registry/node-ca-rg26n" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.922125 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-strhx\" (UniqueName: \"kubernetes.io/projected/0707e5b7-0f43-45e6-a9c9-af60cbbe31de-kube-api-access-strhx\") pod \"node-ca-rg26n\" (UID: \"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\") " pod="openshift-image-registry/node-ca-rg26n" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.922280 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0707e5b7-0f43-45e6-a9c9-af60cbbe31de-serviceca\") pod \"node-ca-rg26n\" (UID: \"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\") " pod="openshift-image-registry/node-ca-rg26n" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.936127 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.983173 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.987505 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.987541 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.987552 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.987575 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:12 crc kubenswrapper[4691]: I1202 07:46:12.987586 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:12Z","lastTransitionTime":"2025-12-02T07:46:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.014946 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.023474 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0707e5b7-0f43-45e6-a9c9-af60cbbe31de-serviceca\") pod \"node-ca-rg26n\" (UID: \"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\") " pod="openshift-image-registry/node-ca-rg26n" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.023528 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0707e5b7-0f43-45e6-a9c9-af60cbbe31de-host\") pod \"node-ca-rg26n\" (UID: \"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\") " pod="openshift-image-registry/node-ca-rg26n" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.023547 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-strhx\" (UniqueName: \"kubernetes.io/projected/0707e5b7-0f43-45e6-a9c9-af60cbbe31de-kube-api-access-strhx\") pod \"node-ca-rg26n\" (UID: \"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\") " pod="openshift-image-registry/node-ca-rg26n" Dec 02 07:46:13 crc 
kubenswrapper[4691]: I1202 07:46:13.023713 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0707e5b7-0f43-45e6-a9c9-af60cbbe31de-host\") pod \"node-ca-rg26n\" (UID: \"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\") " pod="openshift-image-registry/node-ca-rg26n" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.024517 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0707e5b7-0f43-45e6-a9c9-af60cbbe31de-serviceca\") pod \"node-ca-rg26n\" (UID: \"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\") " pod="openshift-image-registry/node-ca-rg26n" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.066501 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-strhx\" (UniqueName: \"kubernetes.io/projected/0707e5b7-0f43-45e6-a9c9-af60cbbe31de-kube-api-access-strhx\") pod \"node-ca-rg26n\" (UID: \"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\") " pod="openshift-image-registry/node-ca-rg26n" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.076526 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.095995 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.096044 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.096057 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.096073 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.096085 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:13Z","lastTransitionTime":"2025-12-02T07:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.119286 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z 
is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.156050 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\
",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.190983 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-rg26n" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.199435 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.200608 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.200643 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.200654 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.200670 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.200681 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:13Z","lastTransitionTime":"2025-12-02T07:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:13 crc kubenswrapper[4691]: W1202 07:46:13.211626 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0707e5b7_0f43_45e6_a9c9_af60cbbe31de.slice/crio-ba9662ef7120f65e7c52e57a6bfb1393b7b028411a29957e8b06188c2dc3f65a WatchSource:0}: Error finding container ba9662ef7120f65e7c52e57a6bfb1393b7b028411a29957e8b06188c2dc3f65a: Status 404 returned error can't find the container with id ba9662ef7120f65e7c52e57a6bfb1393b7b028411a29957e8b06188c2dc3f65a Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.240158 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.306780 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.310400 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.310426 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.310434 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.310447 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.310455 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:13Z","lastTransitionTime":"2025-12-02T07:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.335110 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\
\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.358173 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.395095 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.412261 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.412299 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.412309 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.412323 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.412332 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:13Z","lastTransitionTime":"2025-12-02T07:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.434643 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.473620 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.514913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.514953 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.514965 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.514981 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.514994 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:13Z","lastTransitionTime":"2025-12-02T07:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.516942 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.552920 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.602705 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.617235 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.617270 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.617282 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.617298 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.617308 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:13Z","lastTransitionTime":"2025-12-02T07:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.634332 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.673232 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.720882 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.720933 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.720942 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.720955 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.720964 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:13Z","lastTransitionTime":"2025-12-02T07:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.725224 4691 generic.go:334] "Generic (PLEG): container finished" podID="5ce5053b-1d3d-4bc9-9b65-a38112c18218" containerID="dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9" exitCode=0 Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.725286 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" event={"ID":"5ce5053b-1d3d-4bc9-9b65-a38112c18218","Type":"ContainerDied","Data":"dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.727376 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.729779 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rg26n" event={"ID":"0707e5b7-0f43-45e6-a9c9-af60cbbe31de","Type":"ContainerStarted","Data":"8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.729831 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-rg26n" event={"ID":"0707e5b7-0f43-45e6-a9c9-af60cbbe31de","Type":"ContainerStarted","Data":"ba9662ef7120f65e7c52e57a6bfb1393b7b028411a29957e8b06188c2dc3f65a"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.730326 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.733424 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.733520 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.754211 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.794634 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.823817 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.823856 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.823867 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.823882 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.823893 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:13Z","lastTransitionTime":"2025-12-02T07:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.834102 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.876163 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.920789 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z 
is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.926499 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.926530 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.926541 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.926557 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.926567 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:13Z","lastTransitionTime":"2025-12-02T07:46:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.956016 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:13 crc kubenswrapper[4691]: I1202 07:46:13.995495 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus
\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:13Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.028631 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.028666 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.028675 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.028689 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.028699 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:14Z","lastTransitionTime":"2025-12-02T07:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.037579 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\
\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.075492 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.115299 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.130430 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.130510 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.130520 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.130532 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.130549 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:14Z","lastTransitionTime":"2025-12-02T07:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.157886 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.192802 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.232519 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.232551 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.232561 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.232574 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.232583 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:14Z","lastTransitionTime":"2025-12-02T07:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.234977 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.235087 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.235116 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.235148 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.235197 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:46:18.235169257 +0000 UTC m=+26.019248209 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.235280 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.235300 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.235313 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.235370 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:18.235353771 +0000 UTC m=+26.019432713 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.235441 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.235517 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:18.235506555 +0000 UTC m=+26.019585527 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.235521 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTi
me\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.235631 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.235683 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:18.23566906 +0000 UTC m=+26.019748042 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.275890 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.1
26.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.322034 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51
976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.334445 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.334492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.334502 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.334520 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.334529 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:14Z","lastTransitionTime":"2025-12-02T07:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.336227 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.336365 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.336387 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.336397 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.336434 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:18.336423798 +0000 UTC m=+26.120502660 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.356269 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.395297 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.436160 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.436201 4691 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.436214 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.436231 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.436242 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:14Z","lastTransitionTime":"2025-12-02T07:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.441739 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z 
is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.476859 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.516319 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.539001 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.539038 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.539057 4691 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.539075 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.539087 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:14Z","lastTransitionTime":"2025-12-02T07:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.556131 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.561321 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.561460 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.561564 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.561805 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.561934 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:14 crc kubenswrapper[4691]: E1202 07:46:14.562095 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.595097 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.636989 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.641422 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.641451 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.641461 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.641475 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.641485 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:14Z","lastTransitionTime":"2025-12-02T07:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.676381 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.715485 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.738326 4691 generic.go:334] "Generic (PLEG): container finished" podID="5ce5053b-1d3d-4bc9-9b65-a38112c18218" containerID="6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859" exitCode=0 Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.738393 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" event={"ID":"5ce5053b-1d3d-4bc9-9b65-a38112c18218","Type":"ContainerDied","Data":"6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859"} Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.743196 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.743334 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.743415 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.743485 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.743541 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:14Z","lastTransitionTime":"2025-12-02T07:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.759076 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.796612 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.837153 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.847774 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.847822 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.847834 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.847849 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.847859 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:14Z","lastTransitionTime":"2025-12-02T07:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.875712 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.915425 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.950372 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.950418 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.950427 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.950442 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.950451 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:14Z","lastTransitionTime":"2025-12-02T07:46:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:14 crc kubenswrapper[4691]: I1202 07:46:14.959104 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:
46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:14Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.004775 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.044415 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.052148 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.052178 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.052189 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.052204 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.052214 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:15Z","lastTransitionTime":"2025-12-02T07:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.075626 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.114271 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.154316 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.154349 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.154358 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.154372 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.154384 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:15Z","lastTransitionTime":"2025-12-02T07:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.155401 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.194136 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.234974 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.256173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.256210 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.256222 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.256236 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.256245 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:15Z","lastTransitionTime":"2025-12-02T07:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.280221 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3
891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.357752 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.357805 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.357816 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.357833 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.357843 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:15Z","lastTransitionTime":"2025-12-02T07:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.460096 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.460135 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.460146 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.460164 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.460178 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:15Z","lastTransitionTime":"2025-12-02T07:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.562646 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.562918 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.562983 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.563052 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.563129 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:15Z","lastTransitionTime":"2025-12-02T07:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.641893 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.644844 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.650038 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.662509 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-
o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a
37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.664620 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.664646 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.664656 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.664669 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.664680 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:15Z","lastTransitionTime":"2025-12-02T07:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.674329 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.684135 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.693517 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.706529 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.714608 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.725832 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.741024 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z 
is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.744379 4691 generic.go:334] "Generic (PLEG): container finished" podID="5ce5053b-1d3d-4bc9-9b65-a38112c18218" containerID="8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b" exitCode=0 Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.744440 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" event={"ID":"5ce5053b-1d3d-4bc9-9b65-a38112c18218","Type":"ContainerDied","Data":"8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b"} Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.748278 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4"} Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.759748 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPat
h\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.766701 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.766738 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.766748 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.766779 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.766791 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:15Z","lastTransitionTime":"2025-12-02T07:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.779547 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.797374 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac
38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.809470 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.820087 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.857850 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.870375 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.870411 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.870423 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.870439 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.870449 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:15Z","lastTransitionTime":"2025-12-02T07:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.897385 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.937197 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.972226 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.972265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.972274 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.972290 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.972299 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:15Z","lastTransitionTime":"2025-12-02T07:46:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:15 crc kubenswrapper[4691]: I1202 07:46:15.976000 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:15Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.017318 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.056418 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.074956 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.075009 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.075020 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.075036 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.075097 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:16Z","lastTransitionTime":"2025-12-02T07:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.095975 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.136137 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.177418 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.177454 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.177462 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.177476 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.177485 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:16Z","lastTransitionTime":"2025-12-02T07:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.178140 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.214335 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.254859 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.279416 4691 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.279467 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.279483 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.279503 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.279517 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:16Z","lastTransitionTime":"2025-12-02T07:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.298311 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.343868 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.376317 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.382454 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.382509 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.382525 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.382548 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.382563 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:16Z","lastTransitionTime":"2025-12-02T07:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.416197 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.462543 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.484297 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.484332 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.484340 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.484353 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.484362 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:16Z","lastTransitionTime":"2025-12-02T07:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.561920 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.562009 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.561950 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:16 crc kubenswrapper[4691]: E1202 07:46:16.562083 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:16 crc kubenswrapper[4691]: E1202 07:46:16.562186 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:16 crc kubenswrapper[4691]: E1202 07:46:16.562249 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.586728 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.586786 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.586799 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.586813 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.586823 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:16Z","lastTransitionTime":"2025-12-02T07:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.688934 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.688969 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.688978 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.688990 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.689001 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:16Z","lastTransitionTime":"2025-12-02T07:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.754103 4691 generic.go:334] "Generic (PLEG): container finished" podID="5ce5053b-1d3d-4bc9-9b65-a38112c18218" containerID="0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5" exitCode=0 Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.754157 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" event={"ID":"5ce5053b-1d3d-4bc9-9b65-a38112c18218","Type":"ContainerDied","Data":"0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5"} Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.775566 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f
14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.790918 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.790956 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.790966 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.790981 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.790990 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:16Z","lastTransitionTime":"2025-12-02T07:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.794887 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.807024 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.820580 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.833911 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.844446 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.855403 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.866592 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.884641 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastS
tate\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContain
erStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.896394 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.896705 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.896717 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.896734 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.896745 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:16Z","lastTransitionTime":"2025-12-02T07:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.897536 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.909625 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.938950 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.975414 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:16Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.999057 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.999088 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.999098 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.999113 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:16 crc kubenswrapper[4691]: I1202 07:46:16.999126 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:16Z","lastTransitionTime":"2025-12-02T07:46:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.015593 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.056368 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7
b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.101348 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.101383 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.101394 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.101411 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.101424 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:17Z","lastTransitionTime":"2025-12-02T07:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.203690 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.203728 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.203740 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.203777 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.203792 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:17Z","lastTransitionTime":"2025-12-02T07:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.307641 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.307691 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.307700 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.307716 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.307727 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:17Z","lastTransitionTime":"2025-12-02T07:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.409787 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.409818 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.409827 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.409839 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.409849 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:17Z","lastTransitionTime":"2025-12-02T07:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.512336 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.512367 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.512376 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.512389 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.512397 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:17Z","lastTransitionTime":"2025-12-02T07:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.615341 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.615387 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.615397 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.615412 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.615424 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:17Z","lastTransitionTime":"2025-12-02T07:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.718401 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.718445 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.718458 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.718476 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.718628 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:17Z","lastTransitionTime":"2025-12-02T07:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.762000 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" event={"ID":"5ce5053b-1d3d-4bc9-9b65-a38112c18218","Type":"ContainerStarted","Data":"5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8"} Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.766442 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"7ade89716e213728696429584c5084c4edb55edb847030cbf59fb52e7b7ded66"} Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.766803 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.774336 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.784942 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.790166 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.800633 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.811984 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.821230 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.821268 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.821279 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.821298 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.821309 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:17Z","lastTransitionTime":"2025-12-02T07:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.823500 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-
dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.840226 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.860503 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.880415 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.893508 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.903043 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.911371 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.924165 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.924388 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.924478 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.924562 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.924636 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:17Z","lastTransitionTime":"2025-12-02T07:46:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.927452 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750
f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.938868 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.949212 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.957685 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.967042 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.977279 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 
2025-08-24T17:21:41Z" Dec 02 07:46:17 crc kubenswrapper[4691]: I1202 07:46:17.989172 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.000831 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:17Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.013847 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.026528 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.027475 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.027504 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.027547 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.027565 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.027576 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:18Z","lastTransitionTime":"2025-12-02T07:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.037264 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.047945 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.056311 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.066415 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.094991 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.130748 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.130798 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.130808 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.130824 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.130835 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:18Z","lastTransitionTime":"2025-12-02T07:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.148461 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.179886 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.219267 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.233503 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.233578 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.233605 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.233639 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.233664 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:18Z","lastTransitionTime":"2025-12-02T07:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.262825 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ade89716e213728696429584c5084c4edb55edb
847030cbf59fb52e7b7ded66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.275040 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.275315 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:46:26.275281483 +0000 UTC m=+34.059360375 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.275569 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.275708 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.275811 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:26.275794406 +0000 UTC m=+34.059873308 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.275710 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.275975 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.276174 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.276394 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:26.276370051 +0000 UTC m=+34.060448963 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.276195 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.276644 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.276747 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.276916 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:26.276903815 +0000 UTC m=+34.060982687 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.336461 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.336504 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.336517 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.336535 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.336547 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:18Z","lastTransitionTime":"2025-12-02T07:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.376993 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.377125 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.377144 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.377157 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.377216 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:26.377200411 +0000 UTC m=+34.161279283 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.439454 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.439531 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.439555 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.439588 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.439653 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:18Z","lastTransitionTime":"2025-12-02T07:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.542690 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.542730 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.542830 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.542865 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.542887 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:18Z","lastTransitionTime":"2025-12-02T07:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.561423 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.561454 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.561500 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.561553 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.561637 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:18 crc kubenswrapper[4691]: E1202 07:46:18.561718 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.645086 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.645121 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.645131 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.645145 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.645155 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:18Z","lastTransitionTime":"2025-12-02T07:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.747058 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.747097 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.747108 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.747123 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.747136 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:18Z","lastTransitionTime":"2025-12-02T07:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.769560 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.769597 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.800350 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.820060 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352
827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.831153 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.843745 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.849300 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.849329 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.849337 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.849350 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.849359 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:18Z","lastTransitionTime":"2025-12-02T07:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.856480 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.12
6.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.869714 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.879365 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.891894 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.905210 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.928082 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ade89716e213728696429584c5084c4edb55edb847030cbf59fb52e7b7ded66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-
openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.947276 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.952174 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.952226 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.952239 4691 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.952261 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.952275 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:18Z","lastTransitionTime":"2025-12-02T07:46:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.963223 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.977861 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:18 crc kubenswrapper[4691]: I1202 07:46:18.992601 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:18Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.008498 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.027209 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.055092 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.055555 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.055655 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.055739 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.055844 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:19Z","lastTransitionTime":"2025-12-02T07:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.159517 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.159569 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.159585 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.159614 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.159632 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:19Z","lastTransitionTime":"2025-12-02T07:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.262523 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.262552 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.262561 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.262575 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.262584 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:19Z","lastTransitionTime":"2025-12-02T07:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.367633 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.367666 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.367677 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.367693 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.367704 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:19Z","lastTransitionTime":"2025-12-02T07:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.470233 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.470283 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.470324 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.470340 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.470351 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:19Z","lastTransitionTime":"2025-12-02T07:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.572429 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.572464 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.572473 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.572488 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.572497 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:19Z","lastTransitionTime":"2025-12-02T07:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.676811 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.676897 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.676927 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.676967 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.676997 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:19Z","lastTransitionTime":"2025-12-02T07:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.779899 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.779951 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.779965 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.779986 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.780005 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:19Z","lastTransitionTime":"2025-12-02T07:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.781188 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/0.log" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.783874 4691 generic.go:334] "Generic (PLEG): container finished" podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="7ade89716e213728696429584c5084c4edb55edb847030cbf59fb52e7b7ded66" exitCode=1 Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.783931 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"7ade89716e213728696429584c5084c4edb55edb847030cbf59fb52e7b7ded66"} Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.784845 4691 scope.go:117] "RemoveContainer" containerID="7ade89716e213728696429584c5084c4edb55edb847030cbf59fb52e7b7ded66" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.804899 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.820680 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.833345 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.847622 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.857910 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 
07:46:19.876627 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\
\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.882438 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.882492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.882508 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.882532 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.882546 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:19Z","lastTransitionTime":"2025-12-02T07:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.888723 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.904178 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.919097 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.936725 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.951728 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be
1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.966630 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.984535 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.984576 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.984586 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.984599 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.984608 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:19Z","lastTransitionTime":"2025-12-02T07:46:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:19 crc kubenswrapper[4691]: I1202 07:46:19.987520 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7ade89716e213728696429584c5084c4edb55edb
847030cbf59fb52e7b7ded66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ade89716e213728696429584c5084c4edb55edb847030cbf59fb52e7b7ded66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:19Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:46:19.278026 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:19.278045 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:19.278062 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:46:19.278073 5999 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 07:46:19.278090 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:46:19.278101 5999 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:46:19.278105 5999 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:46:19.278126 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:46:19.278141 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 07:46:19.278148 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:19.278177 5999 factory.go:656] Stopping watch factory\\\\nI1202 07:46:19.278195 5999 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:46:19.278223 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 07:46:19.278232 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:46:19.278238 5999 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:19Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.005547 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.017815 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/n
etns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.087398 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.087438 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.087447 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.087463 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.087475 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:20Z","lastTransitionTime":"2025-12-02T07:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.189847 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.189896 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.189910 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.189927 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.189939 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:20Z","lastTransitionTime":"2025-12-02T07:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.292730 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.292810 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.292823 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.292844 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.292856 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:20Z","lastTransitionTime":"2025-12-02T07:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.395208 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.395249 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.395262 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.395282 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.395294 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:20Z","lastTransitionTime":"2025-12-02T07:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.497479 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.497535 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.497552 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.497609 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.497626 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:20Z","lastTransitionTime":"2025-12-02T07:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.561519 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.561578 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.561633 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:20 crc kubenswrapper[4691]: E1202 07:46:20.561694 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:20 crc kubenswrapper[4691]: E1202 07:46:20.561841 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:20 crc kubenswrapper[4691]: E1202 07:46:20.561939 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.600622 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.600668 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.600677 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.600693 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.600702 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:20Z","lastTransitionTime":"2025-12-02T07:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.702980 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.703021 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.703031 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.703048 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.703058 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:20Z","lastTransitionTime":"2025-12-02T07:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.788822 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/0.log" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.792027 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68"} Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.792422 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.805862 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.805907 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.805917 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.805932 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.805943 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:20Z","lastTransitionTime":"2025-12-02T07:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.809638 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ade89716e213728696429584c5084c4edb55edb847030cbf59fb52e7b7ded66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:19Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:46:19.278026 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:19.278045 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:19.278062 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:46:19.278073 5999 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 07:46:19.278090 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:46:19.278101 5999 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:46:19.278105 5999 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:46:19.278126 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:46:19.278141 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 07:46:19.278148 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:19.278177 5999 factory.go:656] Stopping watch factory\\\\nI1202 07:46:19.278195 5999 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:46:19.278223 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 07:46:19.278232 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:46:19.278238 5999 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.823160 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.836912 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.848399 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.860552 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.870799 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.882600 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.893757 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.904262 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.908074 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.908102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.908111 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.908124 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.908134 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:20Z","lastTransitionTime":"2025-12-02T07:46:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.914590 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.923907 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.933041 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.949847 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.963728 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:20 crc kubenswrapper[4691]: I1202 07:46:20.974498 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:20Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.010400 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.010437 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.010448 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.010466 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.010476 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.113265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.113316 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.113329 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.113347 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.113359 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.207013 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.207044 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.207054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.207067 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.207076 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: E1202 07:46:21.217227 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.220730 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.220780 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.220793 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.220807 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.220816 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: E1202 07:46:21.231590 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.236913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.236960 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.236971 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.236989 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.237002 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: E1202 07:46:21.249123 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.252015 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.252070 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
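
The recurring failure above is a TLS verification error, not a connectivity one: the node-identity webhook at https://127.0.0.1:9743 serves a certificate whose notAfter (2025-08-24T17:21:41Z) predates the node's clock (2025-12-02T07:46:21Z). A minimal sketch of checking that directly, assuming Python with the third-party cryptography package is available on the node; host and port are taken from the logged URL, everything else is illustrative:

    import socket, ssl
    from datetime import datetime, timezone
    from cryptography import x509  # third-party package; assumed available

    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE  # fetch the certificate without trusting it
    with socket.create_connection(("127.0.0.1", 9743), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname="127.0.0.1") as tls:
            der = tls.getpeercert(binary_form=True)
    cert = x509.load_der_x509_certificate(der)
    not_after = cert.not_valid_after.replace(tzinfo=timezone.utc)
    print("notAfter:", not_after)  # the log reports 2025-08-24T17:21:41Z
    print("expired:", datetime.now(timezone.utc) > not_after)
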
event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.252083 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.252099 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.252110 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: E1202 07:46:21.262582 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.266495 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.266541 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
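
The status patch in these entries is quoted twice on its way into the journal, so every '"' inside the JSON appears above as \\\". A short sketch of undoing that to inspect a payload, using an abbreviated fragment in the same escaped form as the entries above:

    import json

    # Abbreviated fragment in the journal's escaped form; each \\\" is one
    # JSON double quote once the two quoting layers are undone.
    fragment = r'{\\\"status\\\":{\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\"}}}'
    patch = json.loads(fragment.replace(r'\\\"', '"'))
    print(patch["status"]["capacity"]["cpu"])  # -> 12

The $setElementOrder/conditions directive in the real payload is strategic-merge-patch metadata: it pins the ordering of the conditions list while only the changed fields of each condition are sent.
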
event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.266553 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.266571 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.266582 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: E1202 07:46:21.278179 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: E1202 07:46:21.278296 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.279720 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
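
"Unable to update node status ... exceeds retry count" closes the loop seen above: per sync, the kubelet makes a bounded number of patch attempts before giving up until the next sync. A schematic sketch of that control flow, assuming the upstream kubelet constant (nodeStatusUpdateRetry = 5); the function names here are hypothetical:

    # Illustrative control flow only; names are hypothetical.
    NODE_STATUS_UPDATE_RETRY = 5  # assumed from upstream kubelet source

    def update_node_status(try_update) -> None:
        for attempt in range(NODE_STATUS_UPDATE_RETRY):
            if try_update(attempt):
                return  # patched successfully
            # each failure logs "Error updating node status, will retry"
        raise RuntimeError("update node status exceeds retry count")

    try:
        update_node_status(lambda attempt: False)  # webhook rejects every attempt
    except RuntimeError as err:
        print(err)
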
event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.279786 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.279799 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.279817 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.279828 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.381754 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.382027 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.382100 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.382168 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.382231 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.484516 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.484555 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.484566 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.484580 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.484592 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.587116 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.587160 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.587173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.587192 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.587206 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.689755 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.689812 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.689824 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.689841 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.689855 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.792795 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.792847 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.792858 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.792872 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.792881 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
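
The NodeNotReady condition repeating above has a second, independent symptom: the runtime network check finds no CNI configuration in /etc/kubernetes/cni/net.d/, consistent with the ovnkube-controller crash loop recorded just below. A rough sketch of the check the message describes, assuming the extension set libcni accepts (.conf, .conflist, .json); the readiness signal actually flows through the CRI runtime's NetworkReady status, so this is an approximation:

    import os

    conf_dir = "/etc/kubernetes/cni/net.d"  # directory named in the log
    try:
        confs = sorted(f for f in os.listdir(conf_dir)
                       if f.endswith((".conf", ".conflist", ".json")))
    except FileNotFoundError:
        confs = []
    if not confs:
        print(f"no CNI configuration file in {conf_dir}. Has your network provider started?")
    else:
        print("CNI configs:", confs)
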
Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.795423 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/1.log" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.795966 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/0.log" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.798134 4691 generic.go:334] "Generic (PLEG): container finished" podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68" exitCode=1 Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.798164 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68"} Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.798207 4691 scope.go:117] "RemoveContainer" containerID="7ade89716e213728696429584c5084c4edb55edb847030cbf59fb52e7b7ded66" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.799651 4691 scope.go:117] "RemoveContainer" containerID="94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68" Dec 02 07:46:21 crc kubenswrapper[4691]: E1202 07:46:21.799915 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124"
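
The "back-off 10s" in the CrashLoopBackOff error above is the first step of the kubelet's restart back-off for the crashing ovnkube-controller container. A small sketch of the resulting delay sequence, assuming the upstream defaults (10s initial delay, doubling per crash, capped at five minutes):

    # Assumed upstream kubelet defaults: 10s initial, x2 per crash, 300s cap.
    delay, cap, delays = 10, 300, []
    while delay < cap:
        delays.append(delay)
        delay *= 2
    delays.append(cap)
    print(delays)  # [10, 20, 40, 80, 160, 300]
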
Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.812310 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.825016 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.836683 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.847752 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.857864 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.876383 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.889133 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.895407 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.895444 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.895456 4691 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.895472 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.895484 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.901682 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.919418 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ade89716e213728696429584c5084c4edb55edb847030cbf59fb52e7b7ded66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:19Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:46:19.278026 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:19.278045 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:19.278062 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:46:19.278073 5999 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 07:46:19.278090 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:46:19.278101 5999 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:46:19.278105 5999 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:46:19.278126 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:46:19.278141 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 07:46:19.278148 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:19.278177 5999 factory.go:656] Stopping watch factory\\\\nI1202 07:46:19.278195 5999 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:46:19.278223 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 07:46:19.278232 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:46:19.278238 5999 handler.go:208] Removed *v1.Node event handler 2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"opping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961094 6117 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) 
from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961158 6117 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961178 6117 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961333 6117 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 07:46:20.961386 6117 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1202 07:46:20.961655 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:20.961669 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:20.961698 6117 factory.go:656] Stopping watch factory\\\\nI1202 07:46:20.961708 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:20.961714 6117 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.920293 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm"] Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.920723 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.922389 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.922674 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.932320 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name
\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.943010 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":
\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.953227 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.963580 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.977105 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.990292 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:21Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.997079 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.997191 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.997380 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.997460 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:21 crc kubenswrapper[4691]: I1202 07:46:21.997546 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:21Z","lastTransitionTime":"2025-12-02T07:46:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.003555 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.010667 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdc9815d-db42-4be2-b58e-4496dca655de-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zszqm\" (UID: \"fdc9815d-db42-4be2-b58e-4496dca655de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.010742 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdc9815d-db42-4be2-b58e-4496dca655de-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zszqm\" (UID: 
\"fdc9815d-db42-4be2-b58e-4496dca655de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.010792 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdc9815d-db42-4be2-b58e-4496dca655de-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zszqm\" (UID: \"fdc9815d-db42-4be2-b58e-4496dca655de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.010839 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxfbm\" (UniqueName: \"kubernetes.io/projected/fdc9815d-db42-4be2-b58e-4496dca655de-kube-api-access-wxfbm\") pod \"ovnkube-control-plane-749d76644c-zszqm\" (UID: \"fdc9815d-db42-4be2-b58e-4496dca655de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.016998 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.026848 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.038404 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.048801 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.065035 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.075749 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.086348 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.095290 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.099576 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.099609 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.099620 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.099636 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.099646 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:22Z","lastTransitionTime":"2025-12-02T07:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.106400 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.112008 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdc9815d-db42-4be2-b58e-4496dca655de-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zszqm\" (UID: \"fdc9815d-db42-4be2-b58e-4496dca655de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.112060 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdc9815d-db42-4be2-b58e-4496dca655de-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zszqm\" (UID: \"fdc9815d-db42-4be2-b58e-4496dca655de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.112080 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdc9815d-db42-4be2-b58e-4496dca655de-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zszqm\" (UID: \"fdc9815d-db42-4be2-b58e-4496dca655de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.112111 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxfbm\" (UniqueName: \"kubernetes.io/projected/fdc9815d-db42-4be2-b58e-4496dca655de-kube-api-access-wxfbm\") pod \"ovnkube-control-plane-749d76644c-zszqm\" (UID: \"fdc9815d-db42-4be2-b58e-4496dca655de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.112715 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fdc9815d-db42-4be2-b58e-4496dca655de-env-overrides\") pod \"ovnkube-control-plane-749d76644c-zszqm\" (UID: \"fdc9815d-db42-4be2-b58e-4496dca655de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.113369 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fdc9815d-db42-4be2-b58e-4496dca655de-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-zszqm\" (UID: \"fdc9815d-db42-4be2-b58e-4496dca655de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.117080 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.118223 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fdc9815d-db42-4be2-b58e-4496dca655de-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-zszqm\" (UID: \"fdc9815d-db42-4be2-b58e-4496dca655de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.127752 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxfbm\" (UniqueName: \"kubernetes.io/projected/fdc9815d-db42-4be2-b58e-4496dca655de-kube-api-access-wxfbm\") pod \"ovnkube-control-plane-749d76644c-zszqm\" (UID: \"fdc9815d-db42-4be2-b58e-4496dca655de\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.128286 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c14976
1d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.140443 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.156648 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46565
6ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ade89716e213728696429584c5084c4edb55edb847030cbf59fb52e7b7ded66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:19Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:46:19.278026 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:19.278045 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:19.278062 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:46:19.278073 5999 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 07:46:19.278090 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:46:19.278101 5999 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:46:19.278105 5999 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:46:19.278126 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:46:19.278141 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 07:46:19.278148 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:19.278177 5999 factory.go:656] Stopping watch factory\\\\nI1202 07:46:19.278195 5999 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:46:19.278223 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 07:46:19.278232 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:46:19.278238 5999 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"opping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961094 6117 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961158 6117 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961178 6117 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961333 6117 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 07:46:20.961386 6117 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1202 07:46:20.961655 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:20.961669 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:20.961698 6117 factory.go:656] Stopping watch factory\\\\nI1202 07:46:20.961708 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:20.961714 6117 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.168243 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.178677 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\
",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.201845 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.201894 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.201906 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.201922 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.201953 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:22Z","lastTransitionTime":"2025-12-02T07:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.232861 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" Dec 02 07:46:22 crc kubenswrapper[4691]: W1202 07:46:22.244730 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdc9815d_db42_4be2_b58e_4496dca655de.slice/crio-7a28ba07b9e7b8cedccf9392b37462dcf261d013dc01cd8ab77d3ec20413c091 WatchSource:0}: Error finding container 7a28ba07b9e7b8cedccf9392b37462dcf261d013dc01cd8ab77d3ec20413c091: Status 404 returned error can't find the container with id 7a28ba07b9e7b8cedccf9392b37462dcf261d013dc01cd8ab77d3ec20413c091 Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.303626 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.303669 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.303682 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.303702 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.303713 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:22Z","lastTransitionTime":"2025-12-02T07:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.405759 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.405811 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.405821 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.405835 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.405844 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:22Z","lastTransitionTime":"2025-12-02T07:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.508099 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.508130 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.508140 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.508155 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.508166 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:22Z","lastTransitionTime":"2025-12-02T07:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.560854 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.560939 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:22 crc kubenswrapper[4691]: E1202 07:46:22.560984 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:22 crc kubenswrapper[4691]: E1202 07:46:22.561067 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.560882 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:22 crc kubenswrapper[4691]: E1202 07:46:22.561153 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.572892 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.586720 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d
7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.598322 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.608291 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.610159 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.610185 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.610194 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.610206 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.610226 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:22Z","lastTransitionTime":"2025-12-02T07:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.621106 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.634209 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.657830 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.705592 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.712315 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.712359 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.712370 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.712387 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.712417 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:22Z","lastTransitionTime":"2025-12-02T07:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.725548 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.735025 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.746686 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.758279 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be
1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.770728 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.787641 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46565
6ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7ade89716e213728696429584c5084c4edb55edb847030cbf59fb52e7b7ded66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:19Z\\\",\\\"message\\\":\\\"/pkg/client/informers/externalversions/factory.go:141\\\\nI1202 07:46:19.278026 5999 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:19.278045 5999 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:19.278062 5999 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1202 07:46:19.278073 5999 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1202 07:46:19.278090 5999 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1202 07:46:19.278101 5999 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1202 07:46:19.278105 5999 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1202 07:46:19.278126 5999 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1202 07:46:19.278141 5999 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1202 07:46:19.278148 5999 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:19.278177 5999 factory.go:656] Stopping watch factory\\\\nI1202 07:46:19.278195 5999 ovnkube.go:599] Stopped ovnkube\\\\nI1202 07:46:19.278223 5999 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1202 07:46:19.278232 5999 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1202 07:46:19.278238 5999 handler.go:208] Removed *v1.Node event handler 
2\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"opping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961094 6117 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961158 6117 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961178 6117 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961333 6117 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 07:46:20.961386 6117 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1202 07:46:20.961655 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:20.961669 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:20.961698 6117 factory.go:656] Stopping watch factory\\\\nI1202 07:46:20.961708 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:20.961714 6117 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.800663 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.802712 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" event={"ID":"fdc9815d-db42-4be2-b58e-4496dca655de","Type":"ContainerStarted","Data":"7a28ba07b9e7b8cedccf9392b37462dcf261d013dc01cd8ab77d3ec20413c091"} Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.804377 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/1.log" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.807426 4691 scope.go:117] "RemoveContainer" containerID="94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68" Dec 02 07:46:22 crc kubenswrapper[4691]: E1202 07:46:22.807593 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.814592 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.814703 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.814786 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.814849 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.814903 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:22Z","lastTransitionTime":"2025-12-02T07:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.815251 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.826232 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.839316 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.853531 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 
cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.865989 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.876840 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.889309 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.898229 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.915150 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.916332 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.916366 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.916379 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.916392 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.916402 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:22Z","lastTransitionTime":"2025-12-02T07:46:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.926987 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.937681 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.946409 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.959999 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.970915 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.982861 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:22 crc kubenswrapper[4691]: I1202 07:46:22.994299 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:22Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.010842 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"opping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961094 6117 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961158 6117 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961178 6117 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961333 6117 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 07:46:20.961386 6117 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1202 07:46:20.961655 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:20.961669 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:20.961698 6117 factory.go:656] Stopping watch factory\\\\nI1202 07:46:20.961708 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:20.961714 6117 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s 
restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.018250 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.018340 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.018350 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.018366 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.018384 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:23Z","lastTransitionTime":"2025-12-02T07:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.120575 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.120616 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.120629 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.120645 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.120658 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:23Z","lastTransitionTime":"2025-12-02T07:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.222352 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.222391 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.222403 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.222420 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.222431 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:23Z","lastTransitionTime":"2025-12-02T07:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.324815 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.324862 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.324873 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.324893 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.324905 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:23Z","lastTransitionTime":"2025-12-02T07:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.429676 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.429714 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.429725 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.429740 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.429750 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:23Z","lastTransitionTime":"2025-12-02T07:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.532187 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.532228 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.532237 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.532251 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.532261 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:23Z","lastTransitionTime":"2025-12-02T07:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.635589 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.635667 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.635694 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.635725 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.635750 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:23Z","lastTransitionTime":"2025-12-02T07:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.738701 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.738826 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.738853 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.738884 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.738909 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:23Z","lastTransitionTime":"2025-12-02T07:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.811184 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" event={"ID":"fdc9815d-db42-4be2-b58e-4496dca655de","Type":"ContainerStarted","Data":"c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65"} Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.811231 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" event={"ID":"fdc9815d-db42-4be2-b58e-4496dca655de","Type":"ContainerStarted","Data":"a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708"} Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.826410 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.841589 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.841632 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.841643 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.841660 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.841672 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:23Z","lastTransitionTime":"2025-12-02T07:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.842404 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.864381 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"opping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961094 6117 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961158 6117 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961178 6117 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961333 6117 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 07:46:20.961386 6117 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1202 07:46:20.961655 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:20.961669 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:20.961698 6117 factory.go:656] Stopping watch factory\\\\nI1202 07:46:20.961708 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:20.961714 6117 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.878958 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.892944 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cn
cf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.906859 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 
builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.918865 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.930594 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.944050 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.944083 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.944095 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.944111 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.944122 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:23Z","lastTransitionTime":"2025-12-02T07:46:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.946285 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.958465 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.976240 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-0
2T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca
7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.988867 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:23 crc kubenswrapper[4691]: I1202 07:46:23.998911 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:23Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.009134 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.019693 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.029404 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.045658 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.045685 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.045696 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.045711 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.045722 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:24Z","lastTransitionTime":"2025-12-02T07:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.134708 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8lqps"] Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.135488 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:24 crc kubenswrapper[4691]: E1202 07:46:24.135579 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.148290 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.148341 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.148357 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.148379 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.148393 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:24Z","lastTransitionTime":"2025-12-02T07:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.153358 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.166458 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.188366 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPa
th\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"opping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961094 6117 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961158 6117 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961178 6117 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961333 6117 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 07:46:20.961386 6117 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1202 07:46:20.961655 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:20.961669 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:20.961698 6117 factory.go:656] Stopping watch factory\\\\nI1202 07:46:20.961708 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:20.961714 6117 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveR
eadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.199837 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.212284 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.225493 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.231201 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.231289 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bd6w\" (UniqueName: \"kubernetes.io/projected/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-kube-api-access-2bd6w\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " 
pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.240959 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.250499 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.250542 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.250555 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.250573 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.250586 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:24Z","lastTransitionTime":"2025-12-02T07:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.254467 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.272236 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.286079 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.300283 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.319137 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.331846 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.332050 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.332081 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bd6w\" (UniqueName: \"kubernetes.io/projected/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-kube-api-access-2bd6w\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:24 crc kubenswrapper[4691]: E1202 07:46:24.332212 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:46:24 crc kubenswrapper[4691]: E1202 07:46:24.332270 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs podName:b30f2d1f-53a1-4e87-819d-1e20bf3ed92a nodeName:}" failed. No retries permitted until 2025-12-02 07:46:24.832253248 +0000 UTC m=+32.616332100 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs") pod "network-metrics-daemon-8lqps" (UID: "b30f2d1f-53a1-4e87-819d-1e20bf3ed92a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.343786 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.349556 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bd6w\" (UniqueName: \"kubernetes.io/projected/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-kube-api-access-2bd6w\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.352663 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.352709 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.352726 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:24 crc kubenswrapper[4691]: 
I1202 07:46:24.352746 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.352773 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:24Z","lastTransitionTime":"2025-12-02T07:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.355517 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.371224 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.381740 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:24Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.456039 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.456118 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.456137 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.456184 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.456202 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:24Z","lastTransitionTime":"2025-12-02T07:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.560045 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.560098 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.560110 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.560128 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.560143 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:24Z","lastTransitionTime":"2025-12-02T07:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.560924 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.560995 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.561012 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:24 crc kubenswrapper[4691]: E1202 07:46:24.561128 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:24 crc kubenswrapper[4691]: E1202 07:46:24.561284 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:24 crc kubenswrapper[4691]: E1202 07:46:24.561387 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.662054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.662112 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.662129 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.662152 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.662169 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:24Z","lastTransitionTime":"2025-12-02T07:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.765665 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.766350 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.766373 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.766403 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.766424 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:24Z","lastTransitionTime":"2025-12-02T07:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.837341 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:24 crc kubenswrapper[4691]: E1202 07:46:24.837757 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:46:24 crc kubenswrapper[4691]: E1202 07:46:24.837873 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs podName:b30f2d1f-53a1-4e87-819d-1e20bf3ed92a nodeName:}" failed. No retries permitted until 2025-12-02 07:46:25.837845206 +0000 UTC m=+33.621924078 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs") pod "network-metrics-daemon-8lqps" (UID: "b30f2d1f-53a1-4e87-819d-1e20bf3ed92a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.869192 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.869264 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.869281 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.869309 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.869328 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:24Z","lastTransitionTime":"2025-12-02T07:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.972135 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.972175 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.972184 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.972200 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:24 crc kubenswrapper[4691]: I1202 07:46:24.972212 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:24Z","lastTransitionTime":"2025-12-02T07:46:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.076318 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.076422 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.076449 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.076483 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.076508 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:25Z","lastTransitionTime":"2025-12-02T07:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.180054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.180119 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.180137 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.180166 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.180184 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:25Z","lastTransitionTime":"2025-12-02T07:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.283711 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.283809 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.283835 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.283866 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.283886 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:25Z","lastTransitionTime":"2025-12-02T07:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.387932 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.387993 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.388012 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.388040 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.388060 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:25Z","lastTransitionTime":"2025-12-02T07:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.491873 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.491931 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.491945 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.491963 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.491974 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:25Z","lastTransitionTime":"2025-12-02T07:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.512671 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.528606 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.544245 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.561507 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:25 crc kubenswrapper[4691]: E1202 07:46:25.561806 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.581472 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6
877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.595400 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.595450 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.595462 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.595480 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.595494 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:25Z","lastTransitionTime":"2025-12-02T07:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.599954 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.614826 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.629412 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.650532 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.667316 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be
1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.692252 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.698648 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.698686 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.698700 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.698723 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.698737 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:25Z","lastTransitionTime":"2025-12-02T07:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.715391 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf
277631b22fefda243071ac68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"opping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961094 6117 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961158 6117 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961178 6117 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961333 6117 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 07:46:20.961386 6117 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1202 07:46:20.961655 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:20.961669 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:20.961698 6117 factory.go:656] Stopping watch factory\\\\nI1202 07:46:20.961708 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:20.961714 6117 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.732662 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.750605 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cn
cf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.770471 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 
07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.791949 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.802154 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.802221 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.802235 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.802259 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.802273 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:25Z","lastTransitionTime":"2025-12-02T07:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.809557 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.829265 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.848009 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:25Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.850627 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:25 crc kubenswrapper[4691]: E1202 07:46:25.850917 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:46:25 crc kubenswrapper[4691]: E1202 07:46:25.851047 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs podName:b30f2d1f-53a1-4e87-819d-1e20bf3ed92a nodeName:}" failed. No retries permitted until 2025-12-02 07:46:27.851017046 +0000 UTC m=+35.635095918 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs") pod "network-metrics-daemon-8lqps" (UID: "b30f2d1f-53a1-4e87-819d-1e20bf3ed92a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.905590 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.905670 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.905704 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.905728 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:25 crc kubenswrapper[4691]: I1202 07:46:25.905744 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:25Z","lastTransitionTime":"2025-12-02T07:46:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.011046 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.011112 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.011125 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.011149 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.011163 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:26Z","lastTransitionTime":"2025-12-02T07:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.115396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.115461 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.115482 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.115508 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.115524 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:26Z","lastTransitionTime":"2025-12-02T07:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.218467 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.218517 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.218529 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.218551 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.218564 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:26Z","lastTransitionTime":"2025-12-02T07:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.320998 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.321045 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.321057 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.321077 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.321092 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:26Z","lastTransitionTime":"2025-12-02T07:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.356914 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.357086 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:46:42.357059476 +0000 UTC m=+50.141138338 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.357165 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.357199 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.357237 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.357316 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.357345 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.357363 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.357378 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.357389 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.357404 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:42.357366534 +0000 UTC m=+50.141445466 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.357422 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:42.357414355 +0000 UTC m=+50.141493217 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.357439 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:42.357430886 +0000 UTC m=+50.141509748 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.423941 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.423980 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.423994 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.424011 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.424024 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:26Z","lastTransitionTime":"2025-12-02T07:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.458117 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.458314 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.458341 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.458354 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.458408 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:46:42.458392779 +0000 UTC m=+50.242471641 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.526752 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.526817 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.526826 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.526839 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.526849 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:26Z","lastTransitionTime":"2025-12-02T07:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.560558 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.560558 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.560666 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.560830 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.560907 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:26 crc kubenswrapper[4691]: E1202 07:46:26.560964 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.628575 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.628624 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.628633 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.628648 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.628656 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:26Z","lastTransitionTime":"2025-12-02T07:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.730668 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.730998 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.731016 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.731030 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.731041 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:26Z","lastTransitionTime":"2025-12-02T07:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.833969 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.834055 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.834068 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.834088 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.834098 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:26Z","lastTransitionTime":"2025-12-02T07:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.936855 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.936913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.936923 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.936941 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:26 crc kubenswrapper[4691]: I1202 07:46:26.936953 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:26Z","lastTransitionTime":"2025-12-02T07:46:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.039988 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.040075 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.040116 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.040140 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.040156 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:27Z","lastTransitionTime":"2025-12-02T07:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.142262 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.142336 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.142348 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.142371 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.142387 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:27Z","lastTransitionTime":"2025-12-02T07:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.245456 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.245510 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.245525 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.245544 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.245557 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:27Z","lastTransitionTime":"2025-12-02T07:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.349048 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.349096 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.349112 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.349128 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.349138 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:27Z","lastTransitionTime":"2025-12-02T07:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.451900 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.452002 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.452029 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.452066 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.452095 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:27Z","lastTransitionTime":"2025-12-02T07:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.555294 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.555336 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.555346 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.555361 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.555371 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:27Z","lastTransitionTime":"2025-12-02T07:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.561064 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:27 crc kubenswrapper[4691]: E1202 07:46:27.561313 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.656916 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.656950 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.656960 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.656977 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.656993 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:27Z","lastTransitionTime":"2025-12-02T07:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.759531 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.759589 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.759597 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.759614 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.759623 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:27Z","lastTransitionTime":"2025-12-02T07:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.861647 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.861690 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.861710 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.861724 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.861732 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:27Z","lastTransitionTime":"2025-12-02T07:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.877391 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:27 crc kubenswrapper[4691]: E1202 07:46:27.877504 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:46:27 crc kubenswrapper[4691]: E1202 07:46:27.877562 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs podName:b30f2d1f-53a1-4e87-819d-1e20bf3ed92a nodeName:}" failed. No retries permitted until 2025-12-02 07:46:31.877547977 +0000 UTC m=+39.661626839 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs") pod "network-metrics-daemon-8lqps" (UID: "b30f2d1f-53a1-4e87-819d-1e20bf3ed92a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.964287 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.964328 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.964339 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.964354 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:27 crc kubenswrapper[4691]: I1202 07:46:27.964362 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:27Z","lastTransitionTime":"2025-12-02T07:46:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.066420 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.066455 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.066464 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.066477 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.066487 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:28Z","lastTransitionTime":"2025-12-02T07:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.169568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.169624 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.169634 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.169651 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.169665 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:28Z","lastTransitionTime":"2025-12-02T07:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.272861 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.272917 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.272929 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.272943 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.272953 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:28Z","lastTransitionTime":"2025-12-02T07:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.375276 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.375326 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.375341 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.375362 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.375376 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:28Z","lastTransitionTime":"2025-12-02T07:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.478126 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.478181 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.478194 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.478213 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.478229 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:28Z","lastTransitionTime":"2025-12-02T07:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.561402 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:28 crc kubenswrapper[4691]: E1202 07:46:28.561525 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.561920 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:28 crc kubenswrapper[4691]: E1202 07:46:28.561977 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.562070 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:28 crc kubenswrapper[4691]: E1202 07:46:28.562220 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.580540 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.580571 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.580582 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.580598 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.580610 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:28Z","lastTransitionTime":"2025-12-02T07:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.682730 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.683137 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.683166 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.683184 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.683194 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:28Z","lastTransitionTime":"2025-12-02T07:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.786574 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.786617 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.786628 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.786644 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.786654 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:28Z","lastTransitionTime":"2025-12-02T07:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.889019 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.889321 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.889423 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.889548 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.889653 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:28Z","lastTransitionTime":"2025-12-02T07:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.992422 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.992468 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.992490 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.992508 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:28 crc kubenswrapper[4691]: I1202 07:46:28.992517 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:28Z","lastTransitionTime":"2025-12-02T07:46:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.095102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.095163 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.095178 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.095199 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.095219 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:29Z","lastTransitionTime":"2025-12-02T07:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.197179 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.197218 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.197226 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.197240 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.197250 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:29Z","lastTransitionTime":"2025-12-02T07:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.299045 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.299100 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.299116 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.299136 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.299149 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:29Z","lastTransitionTime":"2025-12-02T07:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.401359 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.401394 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.401406 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.401421 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.401430 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:29Z","lastTransitionTime":"2025-12-02T07:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.503736 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.503792 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.503806 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.503822 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.503831 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:29Z","lastTransitionTime":"2025-12-02T07:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.561258 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:29 crc kubenswrapper[4691]: E1202 07:46:29.561415 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.606179 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.606228 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.606238 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.606265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.606276 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:29Z","lastTransitionTime":"2025-12-02T07:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.708286 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.708321 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.708332 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.708355 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.708365 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:29Z","lastTransitionTime":"2025-12-02T07:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.811657 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.811712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.811754 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.811788 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.811803 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:29Z","lastTransitionTime":"2025-12-02T07:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.914565 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.914598 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.914608 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.914620 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:29 crc kubenswrapper[4691]: I1202 07:46:29.914629 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:29Z","lastTransitionTime":"2025-12-02T07:46:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.016956 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.017035 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.017057 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.017091 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.017112 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:30Z","lastTransitionTime":"2025-12-02T07:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.119133 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.119185 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.119201 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.119223 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.119238 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:30Z","lastTransitionTime":"2025-12-02T07:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.221575 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.221609 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.221621 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.221634 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.221645 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:30Z","lastTransitionTime":"2025-12-02T07:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.323347 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.323679 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.323908 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.324017 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.324105 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:30Z","lastTransitionTime":"2025-12-02T07:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.426094 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.426312 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.426384 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.426475 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.426532 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:30Z","lastTransitionTime":"2025-12-02T07:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.529061 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.529128 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.529138 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.529154 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.529163 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:30Z","lastTransitionTime":"2025-12-02T07:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.560622 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.560653 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:30 crc kubenswrapper[4691]: E1202 07:46:30.560748 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.560825 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:30 crc kubenswrapper[4691]: E1202 07:46:30.561240 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:30 crc kubenswrapper[4691]: E1202 07:46:30.561397 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.632361 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.632418 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.632437 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.632463 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.632480 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:30Z","lastTransitionTime":"2025-12-02T07:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.735655 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.735690 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.735699 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.735714 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.735723 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:30Z","lastTransitionTime":"2025-12-02T07:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.837322 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.837363 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.837373 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.837388 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.837401 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:30Z","lastTransitionTime":"2025-12-02T07:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.939774 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.939815 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.939824 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.939836 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:30 crc kubenswrapper[4691]: I1202 07:46:30.939846 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:30Z","lastTransitionTime":"2025-12-02T07:46:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.042195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.042240 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.042250 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.042267 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.042275 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.144938 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.145030 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.145055 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.145091 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.145113 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.247215 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.247281 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.247301 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.247328 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.247346 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.350231 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.350263 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.350271 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.350284 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.350295 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.452217 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.452293 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.452317 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.452347 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.452372 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.555570 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.555802 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.555843 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.555876 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.555897 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.560486 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:31 crc kubenswrapper[4691]: E1202 07:46:31.560692 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.561722 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.561812 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.561840 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.561868 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.561889 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: E1202 07:46:31.576038 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:31Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.579972 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.580017 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.580033 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.580054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.580069 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: E1202 07:46:31.593499 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.598711 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.598854 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.598872 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.598895 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.598915 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: E1202 07:46:31.618320 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.622798 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.622868 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.622927 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.622954 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.622972 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: E1202 07:46:31.641585 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:31Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.651717 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.651970 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.652227 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.652299 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.652368 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: E1202 07:46:31.665008 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:31Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:31 crc kubenswrapper[4691]: E1202 07:46:31.665324 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.666999 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.667088 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.667184 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.667264 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.667329 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.769364 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.769609 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.769695 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.769812 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.769889 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.872891 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.872976 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.872994 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.873015 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.873033 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.922485 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:31 crc kubenswrapper[4691]: E1202 07:46:31.922595 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:46:31 crc kubenswrapper[4691]: E1202 07:46:31.922652 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs podName:b30f2d1f-53a1-4e87-819d-1e20bf3ed92a nodeName:}" failed. No retries permitted until 2025-12-02 07:46:39.922637347 +0000 UTC m=+47.706716209 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs") pod "network-metrics-daemon-8lqps" (UID: "b30f2d1f-53a1-4e87-819d-1e20bf3ed92a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.974691 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.974731 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.974743 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.974787 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:31 crc kubenswrapper[4691]: I1202 07:46:31.974805 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:31Z","lastTransitionTime":"2025-12-02T07:46:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.077901 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.077932 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.077942 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.077957 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.077968 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:32Z","lastTransitionTime":"2025-12-02T07:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.179989 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.180036 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.180048 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.180067 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.180080 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:32Z","lastTransitionTime":"2025-12-02T07:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.282558 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.282595 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.282606 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.282620 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.282631 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:32Z","lastTransitionTime":"2025-12-02T07:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.384572 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.384605 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.384614 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.384630 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.384641 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:32Z","lastTransitionTime":"2025-12-02T07:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.487509 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.487547 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.487556 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.487571 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.487582 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:32Z","lastTransitionTime":"2025-12-02T07:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.561499 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.561511 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.561582 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:32 crc kubenswrapper[4691]: E1202 07:46:32.561812 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:32 crc kubenswrapper[4691]: E1202 07:46:32.561882 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:32 crc kubenswrapper[4691]: E1202 07:46:32.561966 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.585633 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b14472353
88416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.589832 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.589890 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.589906 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.589927 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.589943 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:32Z","lastTransitionTime":"2025-12-02T07:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.600147 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.617718 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.633726 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.654985 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.666607 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":tru
e,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.676800 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.686132 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc 
kubenswrapper[4691]: I1202 07:46:32.692139 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.692175 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.692187 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.692202 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.692211 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:32Z","lastTransitionTime":"2025-12-02T07:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.707735 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54
b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.721140 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.735509 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.744933 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.758093 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.773091 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.792609 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"opping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961094 6117 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961158 6117 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961178 6117 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961333 6117 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 07:46:20.961386 6117 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1202 07:46:20.961655 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:20.961669 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:20.961698 6117 factory.go:656] Stopping watch factory\\\\nI1202 07:46:20.961708 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:20.961714 6117 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s 
restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.793894 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.793921 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.793931 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.793945 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.793956 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:32Z","lastTransitionTime":"2025-12-02T07:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.805224 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.817989 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:32Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.896062 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.896104 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.896117 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.896134 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.896149 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:32Z","lastTransitionTime":"2025-12-02T07:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.998357 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.998397 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.998406 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.998420 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:32 crc kubenswrapper[4691]: I1202 07:46:32.998430 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:32Z","lastTransitionTime":"2025-12-02T07:46:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.100195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.100273 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.100287 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.100322 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.100332 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:33Z","lastTransitionTime":"2025-12-02T07:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.202678 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.202727 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.202740 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.202777 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.202791 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:33Z","lastTransitionTime":"2025-12-02T07:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.304873 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.304906 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.304915 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.304928 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.304937 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:33Z","lastTransitionTime":"2025-12-02T07:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.408243 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.408311 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.408324 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.408342 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.408356 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:33Z","lastTransitionTime":"2025-12-02T07:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.511696 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.511748 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.511775 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.511793 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.511804 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:33Z","lastTransitionTime":"2025-12-02T07:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.560976 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:33 crc kubenswrapper[4691]: E1202 07:46:33.561126 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.613830 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.613870 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.613880 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.613897 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.613907 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:33Z","lastTransitionTime":"2025-12-02T07:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.716173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.716209 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.716221 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.716237 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.716249 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:33Z","lastTransitionTime":"2025-12-02T07:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.818863 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.818939 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.818984 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.819003 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.819015 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:33Z","lastTransitionTime":"2025-12-02T07:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.921411 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.921457 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.921468 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.921484 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:33 crc kubenswrapper[4691]: I1202 07:46:33.921497 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:33Z","lastTransitionTime":"2025-12-02T07:46:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.024837 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.024887 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.024895 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.024913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.024923 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:34Z","lastTransitionTime":"2025-12-02T07:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.127115 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.127150 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.127159 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.127173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.127199 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:34Z","lastTransitionTime":"2025-12-02T07:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.230185 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.230257 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.230272 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.230290 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.230338 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:34Z","lastTransitionTime":"2025-12-02T07:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.332612 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.332663 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.332677 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.332698 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.332711 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:34Z","lastTransitionTime":"2025-12-02T07:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.435451 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.435492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.435504 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.435519 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.435529 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:34Z","lastTransitionTime":"2025-12-02T07:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.537682 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.537723 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.537743 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.537772 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.537782 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:34Z","lastTransitionTime":"2025-12-02T07:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.561197 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.561238 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:34 crc kubenswrapper[4691]: E1202 07:46:34.561350 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.561366 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:34 crc kubenswrapper[4691]: E1202 07:46:34.561449 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:34 crc kubenswrapper[4691]: E1202 07:46:34.561554 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.640492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.640530 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.640538 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.640552 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.640561 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:34Z","lastTransitionTime":"2025-12-02T07:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.742525 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.742564 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.742576 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.742592 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.742605 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:34Z","lastTransitionTime":"2025-12-02T07:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.845322 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.845382 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.845398 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.845420 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.845437 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:34Z","lastTransitionTime":"2025-12-02T07:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.947744 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.947790 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.947800 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.947813 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:34 crc kubenswrapper[4691]: I1202 07:46:34.947824 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:34Z","lastTransitionTime":"2025-12-02T07:46:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.050147 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.050202 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.050220 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.050245 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.050262 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:35Z","lastTransitionTime":"2025-12-02T07:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.153401 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.153816 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.153983 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.154134 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.154265 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:35Z","lastTransitionTime":"2025-12-02T07:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.256401 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.256649 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.256717 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.256821 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.256905 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:35Z","lastTransitionTime":"2025-12-02T07:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.358975 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.359011 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.359021 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.359035 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.359047 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:35Z","lastTransitionTime":"2025-12-02T07:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.460988 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.461031 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.461043 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.461058 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.461069 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:35Z","lastTransitionTime":"2025-12-02T07:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.561900 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:35 crc kubenswrapper[4691]: E1202 07:46:35.562078 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.562336 4691 scope.go:117] "RemoveContainer" containerID="94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.563197 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.563257 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.563280 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.563309 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.563327 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:35Z","lastTransitionTime":"2025-12-02T07:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.666143 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.666175 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.666183 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.666207 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.666215 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:35Z","lastTransitionTime":"2025-12-02T07:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.769286 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.769613 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.769625 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.769642 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.769656 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:35Z","lastTransitionTime":"2025-12-02T07:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.850776 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/1.log" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.853208 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3"} Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.853582 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.866619 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.871815 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.871849 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.871860 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.871874 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.871884 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:35Z","lastTransitionTime":"2025-12-02T07:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.878825 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.889221 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.901316 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.912848 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.923905 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.946560 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.961498 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.974180 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.974218 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.974230 4691 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.974246 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:35 crc kubenswrapper[4691]: I1202 07:46:35.974258 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:35Z","lastTransitionTime":"2025-12-02T07:46:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.033713 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:35Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.056047 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"opping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961094 6117 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961158 6117 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961178 6117 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961333 6117 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 07:46:20.961386 6117 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1202 07:46:20.961655 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:20.961669 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:20.961698 6117 factory.go:656] Stopping watch factory\\\\nI1202 07:46:20.961708 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:20.961714 6117 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.076073 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.076987 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.077012 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.077023 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.077038 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.077047 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:36Z","lastTransitionTime":"2025-12-02T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.087975 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.098098 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.108409 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.120320 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.128991 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.140379 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and 
discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.178825 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.178872 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.178882 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.178894 4691 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeNotReady" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.178903 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:36Z","lastTransitionTime":"2025-12-02T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.280670 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.280703 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.280711 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.280725 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.280735 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:36Z","lastTransitionTime":"2025-12-02T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.382474 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.382523 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.382532 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.382544 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.382554 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:36Z","lastTransitionTime":"2025-12-02T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.485142 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.485189 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.485220 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.485237 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.485248 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:36Z","lastTransitionTime":"2025-12-02T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.561337 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:36 crc kubenswrapper[4691]: E1202 07:46:36.561467 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.561365 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:36 crc kubenswrapper[4691]: E1202 07:46:36.561540 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.561336 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:36 crc kubenswrapper[4691]: E1202 07:46:36.561606 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.587098 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.587128 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.587137 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.587150 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.587159 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:36Z","lastTransitionTime":"2025-12-02T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.689960 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.690001 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.690013 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.690029 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.690040 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:36Z","lastTransitionTime":"2025-12-02T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.792650 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.792998 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.793197 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.793429 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.793606 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:36Z","lastTransitionTime":"2025-12-02T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.857706 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/2.log" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.858209 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/1.log" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.860581 4691 generic.go:334] "Generic (PLEG): container finished" podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3" exitCode=1 Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.860634 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3"} Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.860833 4691 scope.go:117] "RemoveContainer" containerID="94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.861338 4691 scope.go:117] "RemoveContainer" containerID="a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3" Dec 02 07:46:36 crc kubenswrapper[4691]: E1202 07:46:36.861525 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.882492 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.896487 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.896518 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.896527 4691 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.896543 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.896554 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:36Z","lastTransitionTime":"2025-12-02T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.898178 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.913263 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.925630 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube
-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.941103 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8
e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.955931 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.970122 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.982245 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.991666 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:36Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.999110 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.999145 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.999173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.999190 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:36 crc kubenswrapper[4691]: I1202 07:46:36.999201 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:36Z","lastTransitionTime":"2025-12-02T07:46:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.002057 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.019506 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.034328 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.049453 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.059353 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.070438 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.086474 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.101066 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.101121 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.101134 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.101149 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.101163 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:37Z","lastTransitionTime":"2025-12-02T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.109995 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5
ebe57b795076d8731dd5bfe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://94fbb7fe0d8d21d76b2be9a9a47c83b047a3fcbf277631b22fefda243071ac68\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"message\\\":\\\"opping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961094 6117 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961158 6117 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961178 6117 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1202 07:46:20.961333 6117 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI1202 07:46:20.961386 6117 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1202 07:46:20.961655 6117 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1202 07:46:20.961669 6117 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1202 07:46:20.961698 6117 factory.go:656] Stopping watch factory\\\\nI1202 07:46:20.961708 6117 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1202 07:46:20.961714 6117 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:36Z\\\",\\\"message\\\":\\\"76644c-zszqm\\\\nI1202 07:46:36.423012 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1202 07:46:36.423058 6348 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-6gcsh after 0 failed attempt(s)\\\\nI1202 07:46:36.423074 6348 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-6gcsh\\\\nI1202 07:46:36.423073 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1202 07:46:36.423082 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1202 07:46:36.423086 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook 
\\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifica\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":
\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.205152 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.205209 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.205225 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.205248 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.205265 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:37Z","lastTransitionTime":"2025-12-02T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.308963 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.309021 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.309039 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.309063 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.309079 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:37Z","lastTransitionTime":"2025-12-02T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.411997 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.412081 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.412104 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.412134 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.412156 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:37Z","lastTransitionTime":"2025-12-02T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.515039 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.515078 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.515087 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.515101 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.515110 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:37Z","lastTransitionTime":"2025-12-02T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.560995 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:37 crc kubenswrapper[4691]: E1202 07:46:37.561191 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.617530 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.617563 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.617574 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.617588 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.617599 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:37Z","lastTransitionTime":"2025-12-02T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.720804 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.720839 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.720850 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.720866 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.720880 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:37Z","lastTransitionTime":"2025-12-02T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.823900 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.823958 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.823972 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.823989 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.824001 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:37Z","lastTransitionTime":"2025-12-02T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.868650 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/2.log" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.873716 4691 scope.go:117] "RemoveContainer" containerID="a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3" Dec 02 07:46:37 crc kubenswrapper[4691]: E1202 07:46:37.873925 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.892065 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.911823 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.926717 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.926787 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.926815 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.926834 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.926846 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:37Z","lastTransitionTime":"2025-12-02T07:46:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.934322 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.946861 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.962555 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.979471 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:37 crc kubenswrapper[4691]: I1202 07:46:37.993122 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:37Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.014571 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91
e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.025474 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.028715 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.028741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.028774 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.028794 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.028807 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:38Z","lastTransitionTime":"2025-12-02T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.037619 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.048033 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.059865 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.069534 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.078685 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.088683 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.100646 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.115882 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:36Z\\\",\\\"message\\\":\\\"76644c-zszqm\\\\nI1202 07:46:36.423012 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1202 07:46:36.423058 6348 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-6gcsh after 0 failed attempt(s)\\\\nI1202 07:46:36.423074 6348 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-6gcsh\\\\nI1202 07:46:36.423073 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1202 07:46:36.423082 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1202 07:46:36.423086 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certifica\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:38Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.131388 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.131553 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.131634 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.131732 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.131830 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:38Z","lastTransitionTime":"2025-12-02T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.234507 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.234732 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.234815 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.234892 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.234982 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:38Z","lastTransitionTime":"2025-12-02T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.338012 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.338052 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.338065 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.338082 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.338094 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:38Z","lastTransitionTime":"2025-12-02T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.441225 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.441565 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.441805 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.442050 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.442234 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:38Z","lastTransitionTime":"2025-12-02T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.548567 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.548660 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.548684 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.548708 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.548811 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:38Z","lastTransitionTime":"2025-12-02T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.561174 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.561224 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:38 crc kubenswrapper[4691]: E1202 07:46:38.561319 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:38 crc kubenswrapper[4691]: E1202 07:46:38.561492 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.561744 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:38 crc kubenswrapper[4691]: E1202 07:46:38.562181 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.651387 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.651452 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.651475 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.651520 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.651542 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:38Z","lastTransitionTime":"2025-12-02T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.756990 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.757070 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.757091 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.757113 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.757130 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:38Z","lastTransitionTime":"2025-12-02T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.859894 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.859937 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.859946 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.859961 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.859970 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:38Z","lastTransitionTime":"2025-12-02T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.963435 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.963492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.963509 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.963538 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:38 crc kubenswrapper[4691]: I1202 07:46:38.963554 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:38Z","lastTransitionTime":"2025-12-02T07:46:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.066793 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.066875 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.066895 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.066929 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.066960 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:39Z","lastTransitionTime":"2025-12-02T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.170124 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.170173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.170185 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.170207 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.170222 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:39Z","lastTransitionTime":"2025-12-02T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.273296 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.273358 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.273376 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.273403 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.273421 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:39Z","lastTransitionTime":"2025-12-02T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.376789 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.376839 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.376853 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.376868 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.376879 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:39Z","lastTransitionTime":"2025-12-02T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.479499 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.479554 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.479567 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.479586 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.479598 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:39Z","lastTransitionTime":"2025-12-02T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.560510 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:39 crc kubenswrapper[4691]: E1202 07:46:39.560941 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.582508 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.582581 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.582607 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.582636 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.582657 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:39Z","lastTransitionTime":"2025-12-02T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.685071 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.685136 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.685154 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.685179 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.685196 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:39Z","lastTransitionTime":"2025-12-02T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.788863 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.788954 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.788975 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.789002 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.789019 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:39Z","lastTransitionTime":"2025-12-02T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.892188 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.892260 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.892284 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.892313 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.892333 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:39Z","lastTransitionTime":"2025-12-02T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.995290 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.995349 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.995371 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.995396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 07:46:39 crc kubenswrapper[4691]: I1202 07:46:39.995416 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:39Z","lastTransitionTime":"2025-12-02T07:46:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.008891 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps"
Dec 02 07:46:40 crc kubenswrapper[4691]: E1202 07:46:40.009156 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 07:46:40 crc kubenswrapper[4691]: E1202 07:46:40.009328 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs podName:b30f2d1f-53a1-4e87-819d-1e20bf3ed92a nodeName:}" failed. No retries permitted until 2025-12-02 07:46:56.009293521 +0000 UTC m=+63.793372423 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs") pod "network-metrics-daemon-8lqps" (UID: "b30f2d1f-53a1-4e87-819d-1e20bf3ed92a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.098539 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.098611 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.098631 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.098664 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.098687 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:40Z","lastTransitionTime":"2025-12-02T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.202040 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.202116 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.202136 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.202163 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.202180 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:40Z","lastTransitionTime":"2025-12-02T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.306172 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.306245 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.306268 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.306306 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.306339 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:40Z","lastTransitionTime":"2025-12-02T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.409355 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.409403 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.409417 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.409439 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.409452 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:40Z","lastTransitionTime":"2025-12-02T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.512366 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.512426 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.512441 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.512464 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.512477 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:40Z","lastTransitionTime":"2025-12-02T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.561404 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.561543 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.561626 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 07:46:40 crc kubenswrapper[4691]: E1202 07:46:40.561551 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 07:46:40 crc kubenswrapper[4691]: E1202 07:46:40.561803 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 07:46:40 crc kubenswrapper[4691]: E1202 07:46:40.561860 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.615504 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.615587 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.615605 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.615628 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.615640 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:40Z","lastTransitionTime":"2025-12-02T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.718976 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.719016 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.719029 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.719048 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.719063 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:40Z","lastTransitionTime":"2025-12-02T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.821129 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.821203 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.821216 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.821234 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.821248 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:40Z","lastTransitionTime":"2025-12-02T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.924832 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.924886 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.924902 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.924923 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:40 crc kubenswrapper[4691]: I1202 07:46:40.924934 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:40Z","lastTransitionTime":"2025-12-02T07:46:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.027009 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.027045 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.027059 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.027077 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.027089 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.129844 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.130187 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.130373 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.130504 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.130620 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.234048 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.234124 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.234143 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.234165 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.234176 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.337350 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.337398 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.337416 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.337439 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.337456 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.439828 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.439871 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.439883 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.439910 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.439923 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.542073 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.542105 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.542115 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.542127 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.542137 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.561536 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps"
Dec 02 07:46:41 crc kubenswrapper[4691]: E1202 07:46:41.561671 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a"
Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.644973 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.645023 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.645036 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.645055 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.645066 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.747154 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.747193 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.747206 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.747222 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.747233 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.851052 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.851144 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.851169 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.851611 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.851633 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.939215 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.939322 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.939342 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.939376 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.939396 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:41 crc kubenswrapper[4691]: E1202 07:46:41.954789 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.959885 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.959931 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.959945 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.959962 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.960006 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:41 crc kubenswrapper[4691]: E1202 07:46:41.974596 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.979179 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.979215 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.979228 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.979279 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.979291 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:41 crc kubenswrapper[4691]: E1202 07:46:41.994001 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... duplicate image list elided; byte-identical to the image list in the preceding status-patch attempt ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:41Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.998517 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.998570 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.998582 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.998606 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:41 crc kubenswrapper[4691]: I1202 07:46:41.998619 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:41Z","lastTransitionTime":"2025-12-02T07:46:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.012850 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... duplicate image list elided; byte-identical to the image list in the preceding status-patch attempt ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.017020 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.017077 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.017095 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.017117 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.017130 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:42Z","lastTransitionTime":"2025-12-02T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.030210 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ... duplicate image list elided; byte-identical to the image list in the preceding status-patch attempt ... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.030358 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.031949 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.031984 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.031998 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.032015 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.032027 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:42Z","lastTransitionTime":"2025-12-02T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.134798 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.134839 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.134851 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.134869 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.134881 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:42Z","lastTransitionTime":"2025-12-02T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.237512 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.237564 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.237583 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.237607 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.237622 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:42Z","lastTransitionTime":"2025-12-02T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.339986 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.340022 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.340031 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.340045 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.340053 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:42Z","lastTransitionTime":"2025-12-02T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.434399 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.434531 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.434596 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:47:14.434571324 +0000 UTC m=+82.218650186 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.434619 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.434671 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.434702 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:47:14.434689757 +0000 UTC m=+82.218768609 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.434719 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.434849 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.434874 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.434886 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.434896 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.434897 4691 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:47:14.434885262 +0000 UTC m=+82.218964184 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.434929 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:47:14.434920963 +0000 UTC m=+82.218999825 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.441750 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.441816 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.441824 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.441856 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.441868 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:42Z","lastTransitionTime":"2025-12-02T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.535127 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.535271 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.535284 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.535295 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.535334 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:47:14.535322375 +0000 UTC m=+82.319401237 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.543883 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.543963 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.543984 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.544037 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.544055 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:42Z","lastTransitionTime":"2025-12-02T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.561411 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.561442 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.561809 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.561470 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.562339 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:42 crc kubenswrapper[4691]: E1202 07:46:42.562042 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.581569 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.598469 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.614828 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.629161 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.645650 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.645990 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.646317 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.646499 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.646666 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:42Z","lastTransitionTime":"2025-12-02T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.649965 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.665076 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.678571 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.694027 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.706398 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.720345 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.734833 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.750353 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.750394 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.750403 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.750418 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.750428 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:42Z","lastTransitionTime":"2025-12-02T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.760534 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.777190 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.792895 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z"
Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.810878 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:36Z\\\",\\\"message\\\":\\\"76644c-zszqm\\\\nI1202 07:46:36.423012 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1202 07:46:36.423058 6348 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-6gcsh after 0 failed attempt(s)\\\\nI1202 07:46:36.423074 6348 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-6gcsh\\\\nI1202 07:46:36.423073 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1202 07:46:36.423082 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1202 07:46:36.423086 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certifica\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.823787 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.837143 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:42Z is after 
2025-08-24T17:21:41Z" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.853572 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.854351 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.854500 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.854636 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.854815 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:42Z","lastTransitionTime":"2025-12-02T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.958172 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.958498 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.958603 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.958712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:42 crc kubenswrapper[4691]: I1202 07:46:42.958847 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:42Z","lastTransitionTime":"2025-12-02T07:46:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.061660 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.061712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.061733 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.061774 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.061790 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:43Z","lastTransitionTime":"2025-12-02T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.165175 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.165255 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.165277 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.165308 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.165329 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:43Z","lastTransitionTime":"2025-12-02T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.268415 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.268450 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.268460 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.268474 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.268485 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:43Z","lastTransitionTime":"2025-12-02T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.370140 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.370189 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.370199 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.370216 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.370232 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:43Z","lastTransitionTime":"2025-12-02T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.473539 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.473782 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.473891 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.473957 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.474022 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:43Z","lastTransitionTime":"2025-12-02T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.561580 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:43 crc kubenswrapper[4691]: E1202 07:46:43.561802 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.576236 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.576281 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.576292 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.576307 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.576316 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:43Z","lastTransitionTime":"2025-12-02T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.678581 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.678616 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.678625 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.678640 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.678649 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:43Z","lastTransitionTime":"2025-12-02T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.781861 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.781928 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.781946 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.781970 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.781988 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:43Z","lastTransitionTime":"2025-12-02T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.884351 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.884402 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.884410 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.884425 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.884434 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:43Z","lastTransitionTime":"2025-12-02T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.987057 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.987108 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.987117 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.987130 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:43 crc kubenswrapper[4691]: I1202 07:46:43.987141 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:43Z","lastTransitionTime":"2025-12-02T07:46:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.089843 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.089919 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.089942 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.089972 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.089995 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:44Z","lastTransitionTime":"2025-12-02T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.192618 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.192673 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.192693 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.192718 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.192735 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:44Z","lastTransitionTime":"2025-12-02T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.295632 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.295704 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.295724 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.295749 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.295793 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:44Z","lastTransitionTime":"2025-12-02T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.398736 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.399231 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.399511 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.399672 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.399839 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:44Z","lastTransitionTime":"2025-12-02T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.461100 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.476032 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.484461 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.502004 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.502395 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.502448 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.502465 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.502482 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.502496 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:44Z","lastTransitionTime":"2025-12-02T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.514542 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc 
kubenswrapper[4691]: I1202 07:46:44.527654 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.536456 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.551267 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.560654 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.560713 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:44 crc kubenswrapper[4691]: E1202 07:46:44.560881 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.560916 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:44 crc kubenswrapper[4691]: E1202 07:46:44.561058 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:44 crc kubenswrapper[4691]: E1202 07:46:44.561201 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.580637 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.597706 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.605101 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.605173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.605191 4691 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.605215 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.605234 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:44Z","lastTransitionTime":"2025-12-02T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.611576 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.635021 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:36Z\\\",\\\"message\\\":\\\"76644c-zszqm\\\\nI1202 07:46:36.423012 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1202 07:46:36.423058 6348 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-6gcsh after 0 failed attempt(s)\\\\nI1202 07:46:36.423074 6348 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-6gcsh\\\\nI1202 07:46:36.423073 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1202 07:46:36.423082 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1202 07:46:36.423086 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifica\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.653810 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-
cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.671778 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.689782 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.704550 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.707589 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.707640 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.707659 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.707688 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.707706 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:44Z","lastTransitionTime":"2025-12-02T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.728030 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.744221 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.763467 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:44Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.810783 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.810834 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.810847 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.810867 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.810878 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:44Z","lastTransitionTime":"2025-12-02T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.912537 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.912668 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.912688 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.912712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:44 crc kubenswrapper[4691]: I1202 07:46:44.912729 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:44Z","lastTransitionTime":"2025-12-02T07:46:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.016265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.016352 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.016379 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.016411 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.016434 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:45Z","lastTransitionTime":"2025-12-02T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.119650 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.119722 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.119742 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.119828 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.119869 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:45Z","lastTransitionTime":"2025-12-02T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.221913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.221961 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.221973 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.221991 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.222003 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:45Z","lastTransitionTime":"2025-12-02T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.324822 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.324882 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.324893 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.324910 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.324921 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:45Z","lastTransitionTime":"2025-12-02T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.427815 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.427850 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.427858 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.427871 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.427880 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:45Z","lastTransitionTime":"2025-12-02T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.530706 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.530786 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.530802 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.530825 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.530839 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:45Z","lastTransitionTime":"2025-12-02T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.560827 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:45 crc kubenswrapper[4691]: E1202 07:46:45.561049 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.634175 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.634448 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.634552 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.634633 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.634728 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:45Z","lastTransitionTime":"2025-12-02T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.737262 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.737578 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.737684 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.737905 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.738081 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:45Z","lastTransitionTime":"2025-12-02T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.840375 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.840738 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.840986 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.841137 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.841269 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:45Z","lastTransitionTime":"2025-12-02T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.943264 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.943952 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.943985 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.944010 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:45 crc kubenswrapper[4691]: I1202 07:46:45.944026 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:45Z","lastTransitionTime":"2025-12-02T07:46:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.046964 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.047005 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.047015 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.047030 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.047038 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:46Z","lastTransitionTime":"2025-12-02T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.149215 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.149288 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.149300 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.149317 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.149326 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:46Z","lastTransitionTime":"2025-12-02T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.252284 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.252347 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.252363 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.252382 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.252394 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:46Z","lastTransitionTime":"2025-12-02T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.355113 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.355155 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.355167 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.355184 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.355195 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:46Z","lastTransitionTime":"2025-12-02T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.457995 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.458045 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.458054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.458066 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.458074 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:46Z","lastTransitionTime":"2025-12-02T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.560070 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.560155 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.560175 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.560211 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.560273 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:46Z","lastTransitionTime":"2025-12-02T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.560616 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.560683 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.560748 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:46 crc kubenswrapper[4691]: E1202 07:46:46.560860 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:46 crc kubenswrapper[4691]: E1202 07:46:46.560968 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:46 crc kubenswrapper[4691]: E1202 07:46:46.561049 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.662090 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.662136 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.662145 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.662164 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.662180 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:46Z","lastTransitionTime":"2025-12-02T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.765033 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.765080 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.765091 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.765108 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.765122 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:46Z","lastTransitionTime":"2025-12-02T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.867499 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.867697 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.867802 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.867876 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.867951 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:46Z","lastTransitionTime":"2025-12-02T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.970363 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.970402 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.970413 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.970429 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:46 crc kubenswrapper[4691]: I1202 07:46:46.970440 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:46Z","lastTransitionTime":"2025-12-02T07:46:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.072572 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.072618 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.072630 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.072662 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.072675 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:47Z","lastTransitionTime":"2025-12-02T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.175556 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.175682 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.175704 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.175729 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.175748 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:47Z","lastTransitionTime":"2025-12-02T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.277877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.277942 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.277965 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.277993 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.278014 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:47Z","lastTransitionTime":"2025-12-02T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.380690 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.380735 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.380748 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.380791 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.380805 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:47Z","lastTransitionTime":"2025-12-02T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.482895 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.482941 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.482953 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.482969 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.482980 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:47Z","lastTransitionTime":"2025-12-02T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.560498 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:47 crc kubenswrapper[4691]: E1202 07:46:47.560652 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.585237 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.585284 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.585294 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.585308 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.585317 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:47Z","lastTransitionTime":"2025-12-02T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.687996 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.688048 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.688071 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.688101 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.688123 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:47Z","lastTransitionTime":"2025-12-02T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.791021 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.791079 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.791115 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.791145 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.791167 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:47Z","lastTransitionTime":"2025-12-02T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.893177 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.893232 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.893241 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.893256 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.893265 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:47Z","lastTransitionTime":"2025-12-02T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.995966 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.996007 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.996018 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.996036 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:47 crc kubenswrapper[4691]: I1202 07:46:47.996047 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:47Z","lastTransitionTime":"2025-12-02T07:46:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.098382 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.098448 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.098463 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.098482 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.098492 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:48Z","lastTransitionTime":"2025-12-02T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.204834 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.204871 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.204883 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.204901 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.204913 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:48Z","lastTransitionTime":"2025-12-02T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.307245 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.307281 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.307293 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.307309 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.307322 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:48Z","lastTransitionTime":"2025-12-02T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.409506 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.409552 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.409565 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.409584 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.409595 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:48Z","lastTransitionTime":"2025-12-02T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.512084 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.512127 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.512138 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.512154 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.512165 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:48Z","lastTransitionTime":"2025-12-02T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.561177 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.561271 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:48 crc kubenswrapper[4691]: E1202 07:46:48.561425 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.561495 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:48 crc kubenswrapper[4691]: E1202 07:46:48.561613 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:48 crc kubenswrapper[4691]: E1202 07:46:48.561799 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.614484 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.614564 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.614609 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.614647 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.614672 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:48Z","lastTransitionTime":"2025-12-02T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.718506 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.718586 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.718601 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.718619 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.718634 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:48Z","lastTransitionTime":"2025-12-02T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.821580 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.821619 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.821630 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.821651 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.821663 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:48Z","lastTransitionTime":"2025-12-02T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.924915 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.924981 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.924998 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.925023 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:48 crc kubenswrapper[4691]: I1202 07:46:48.925044 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:48Z","lastTransitionTime":"2025-12-02T07:46:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.028853 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.028940 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.028961 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.029000 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.029024 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:49Z","lastTransitionTime":"2025-12-02T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.132442 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.132638 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.132652 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.132669 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.132681 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:49Z","lastTransitionTime":"2025-12-02T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.236724 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.236842 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.236866 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.236895 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.236915 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:49Z","lastTransitionTime":"2025-12-02T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.340595 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.340673 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.340698 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.340737 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.340794 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:49Z","lastTransitionTime":"2025-12-02T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.443517 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.443559 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.443570 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.443590 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.443602 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:49Z","lastTransitionTime":"2025-12-02T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.546241 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.546302 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.546315 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.546338 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.546350 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:49Z","lastTransitionTime":"2025-12-02T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.561550 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:49 crc kubenswrapper[4691]: E1202 07:46:49.561745 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.649383 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.649444 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.649458 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.649480 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.649494 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:49Z","lastTransitionTime":"2025-12-02T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.753107 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.753185 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.753204 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.753232 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.753249 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:49Z","lastTransitionTime":"2025-12-02T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.856391 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.856444 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.856459 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.856481 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.856505 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:49Z","lastTransitionTime":"2025-12-02T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.960180 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.960249 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.960269 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.960300 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:49 crc kubenswrapper[4691]: I1202 07:46:49.960323 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:49Z","lastTransitionTime":"2025-12-02T07:46:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.063119 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.063178 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.063190 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.063207 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.063224 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:50Z","lastTransitionTime":"2025-12-02T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.166316 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.166358 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.166367 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.166383 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.166395 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:50Z","lastTransitionTime":"2025-12-02T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.268409 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.268468 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.268477 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.268489 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.268520 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:50Z","lastTransitionTime":"2025-12-02T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.370929 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.370983 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.370997 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.371015 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.371031 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:50Z","lastTransitionTime":"2025-12-02T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.474298 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.474395 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.474411 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.474428 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.474442 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:50Z","lastTransitionTime":"2025-12-02T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.561349 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.561357 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.561595 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:50 crc kubenswrapper[4691]: E1202 07:46:50.561595 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:50 crc kubenswrapper[4691]: E1202 07:46:50.561703 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:50 crc kubenswrapper[4691]: E1202 07:46:50.561748 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.562367 4691 scope.go:117] "RemoveContainer" containerID="a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3" Dec 02 07:46:50 crc kubenswrapper[4691]: E1202 07:46:50.562507 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.576578 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.576620 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.576633 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.576649 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.576670 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:50Z","lastTransitionTime":"2025-12-02T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.679471 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.679508 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.679517 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.679534 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.679544 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:50Z","lastTransitionTime":"2025-12-02T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.786256 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.786298 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.786308 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.786323 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.786335 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:50Z","lastTransitionTime":"2025-12-02T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.888246 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.888291 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.888303 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.888318 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.888331 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:50Z","lastTransitionTime":"2025-12-02T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.991225 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.991296 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.991308 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.991330 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:50 crc kubenswrapper[4691]: I1202 07:46:50.991346 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:50Z","lastTransitionTime":"2025-12-02T07:46:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.093786 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.093852 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.093873 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.093902 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.093921 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:51Z","lastTransitionTime":"2025-12-02T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.197216 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.197297 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.197310 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.197333 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.197348 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:51Z","lastTransitionTime":"2025-12-02T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.299315 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.299355 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.299365 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.299400 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.299409 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:51Z","lastTransitionTime":"2025-12-02T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.402188 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.402691 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.402703 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.402720 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.402733 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:51Z","lastTransitionTime":"2025-12-02T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.505439 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.505712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.505842 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.505920 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.505997 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:51Z","lastTransitionTime":"2025-12-02T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.560630 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:51 crc kubenswrapper[4691]: E1202 07:46:51.560845 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.609767 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.609814 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.609825 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.609845 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.609861 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:51Z","lastTransitionTime":"2025-12-02T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.712439 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.712489 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.712503 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.712525 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.712539 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:51Z","lastTransitionTime":"2025-12-02T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.815701 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.815744 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.815770 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.815787 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.815798 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:51Z","lastTransitionTime":"2025-12-02T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.919609 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.919650 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.919661 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.919677 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:51 crc kubenswrapper[4691]: I1202 07:46:51.919689 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:51Z","lastTransitionTime":"2025-12-02T07:46:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.022498 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.022547 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.022567 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.022586 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.022596 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.125297 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.125972 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.126016 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.126048 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.126071 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.198249 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.198314 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.198332 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.198355 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.198374 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: E1202 07:46:52.212813 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.217113 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.217153 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.217167 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.217183 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.217197 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: E1202 07:46:52.229940 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.234220 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.234255 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.234267 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.234284 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.234297 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: E1202 07:46:52.247573 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.251392 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.251441 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.251456 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.251477 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.251495 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: E1202 07:46:52.265216 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.269102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.269135 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.269148 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.269164 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.269175 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: E1202 07:46:52.281280 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: E1202 07:46:52.281391 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.282828 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.282854 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.282863 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.282877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.282885 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.385477 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.385521 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.385530 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.385545 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.385554 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.487941 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.488296 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.488380 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.488463 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.488542 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.561352 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.561405 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.561409 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:52 crc kubenswrapper[4691]: E1202 07:46:52.561460 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:52 crc kubenswrapper[4691]: E1202 07:46:52.561622 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:52 crc kubenswrapper[4691]: E1202 07:46:52.561676 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.576675 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerI
D\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 
07:46:52.589751 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.591090 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.591127 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.591139 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.591157 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.591169 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.603839 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753
fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.619878 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.634977 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.649640 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.662646 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"nam
e\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.674162 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.691741 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.693448 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.693494 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.693539 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.693559 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.693572 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.704096 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.713633 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.722418 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.732878 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1b45c56-bd4e-4cb9-bef7-55abe7ddef5f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001ae01619f89617dcf8704bacccb92d0c42ea23b8e935b170139d7a204a0ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3496e0007711c8e371e0f7fdcc4907faab2b72487290947fa2060054478a7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://830344d995bf21e3650a9d520bea20b313abffebd4891ff834259c7f3fa6f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.742394 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.753875 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.773341 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5
ebe57b795076d8731dd5bfe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:36Z\\\",\\\"message\\\":\\\"76644c-zszqm\\\\nI1202 07:46:36.423012 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1202 07:46:36.423058 6348 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-6gcsh after 0 failed attempt(s)\\\\nI1202 07:46:36.423074 6348 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-6gcsh\\\\nI1202 07:46:36.423073 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1202 07:46:36.423082 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1202 07:46:36.423086 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifica\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.786648 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.795816 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.795877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.795889 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.795912 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.795924 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.805432 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:52Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.898423 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.898469 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.898481 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.898502 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:52 crc kubenswrapper[4691]: I1202 07:46:52.898515 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:52Z","lastTransitionTime":"2025-12-02T07:46:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.001613 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.001671 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.001684 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.001708 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.001723 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:53Z","lastTransitionTime":"2025-12-02T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.105400 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.105454 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.105466 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.105483 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.105495 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:53Z","lastTransitionTime":"2025-12-02T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.209273 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.209329 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.209343 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.209364 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.209376 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:53Z","lastTransitionTime":"2025-12-02T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.312752 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.312847 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.312863 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.312887 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.312906 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:53Z","lastTransitionTime":"2025-12-02T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.415868 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.415904 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.415913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.415927 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.415936 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:53Z","lastTransitionTime":"2025-12-02T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.518470 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.518519 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.518531 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.518547 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.518558 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:53Z","lastTransitionTime":"2025-12-02T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.560685 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:53 crc kubenswrapper[4691]: E1202 07:46:53.560856 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.620729 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.620799 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.620813 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.620831 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.620845 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:53Z","lastTransitionTime":"2025-12-02T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.724024 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.724104 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.724120 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.724150 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.724166 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:53Z","lastTransitionTime":"2025-12-02T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.827379 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.827455 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.827474 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.827503 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.827524 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:53Z","lastTransitionTime":"2025-12-02T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.930712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.930748 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.930776 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.930791 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:53 crc kubenswrapper[4691]: I1202 07:46:53.930801 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:53Z","lastTransitionTime":"2025-12-02T07:46:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.033745 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.033805 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.033816 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.033830 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.033839 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:54Z","lastTransitionTime":"2025-12-02T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.135651 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.135708 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.135720 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.135738 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.135749 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:54Z","lastTransitionTime":"2025-12-02T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.238339 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.238386 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.238395 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.238409 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.238418 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:54Z","lastTransitionTime":"2025-12-02T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.342805 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.342884 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.342915 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.342947 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.342970 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:54Z","lastTransitionTime":"2025-12-02T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.446110 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.446169 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.446182 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.446201 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.446213 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:54Z","lastTransitionTime":"2025-12-02T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.549346 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.549394 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.549406 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.549424 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.549443 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:54Z","lastTransitionTime":"2025-12-02T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.560511 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.560548 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:54 crc kubenswrapper[4691]: E1202 07:46:54.560638 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.560661 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:54 crc kubenswrapper[4691]: E1202 07:46:54.560799 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:54 crc kubenswrapper[4691]: E1202 07:46:54.560902 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.653164 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.653214 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.653231 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.653253 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.653272 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:54Z","lastTransitionTime":"2025-12-02T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.755848 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.755893 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.755904 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.755921 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.755933 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:54Z","lastTransitionTime":"2025-12-02T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.858152 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.858208 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.858223 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.858238 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.858247 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:54Z","lastTransitionTime":"2025-12-02T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.960290 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.960333 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.960343 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.960358 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:54 crc kubenswrapper[4691]: I1202 07:46:54.960369 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:54Z","lastTransitionTime":"2025-12-02T07:46:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.063421 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.063465 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.063474 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.063488 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.063498 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:55Z","lastTransitionTime":"2025-12-02T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.165752 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.165826 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.165836 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.165852 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.165863 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:55Z","lastTransitionTime":"2025-12-02T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.268007 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.268054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.268065 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.268084 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.268094 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:55Z","lastTransitionTime":"2025-12-02T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.370100 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.370158 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.370176 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.370201 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.370227 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:55Z","lastTransitionTime":"2025-12-02T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.473401 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.473485 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.473522 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.473560 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.473587 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:55Z","lastTransitionTime":"2025-12-02T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.561138 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:55 crc kubenswrapper[4691]: E1202 07:46:55.561295 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.575957 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.576010 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.576026 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.576051 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.576069 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:55Z","lastTransitionTime":"2025-12-02T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.679164 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.679233 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.679253 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.679283 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.679311 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:55Z","lastTransitionTime":"2025-12-02T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.782004 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.782063 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.782077 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.782096 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.782107 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:55Z","lastTransitionTime":"2025-12-02T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.884480 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.884514 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.884522 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.884537 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.884546 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:55Z","lastTransitionTime":"2025-12-02T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.987205 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.987248 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.987256 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.987273 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:55 crc kubenswrapper[4691]: I1202 07:46:55.987282 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:55Z","lastTransitionTime":"2025-12-02T07:46:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.052129 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:56 crc kubenswrapper[4691]: E1202 07:46:56.052412 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:46:56 crc kubenswrapper[4691]: E1202 07:46:56.052481 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs podName:b30f2d1f-53a1-4e87-819d-1e20bf3ed92a nodeName:}" failed. No retries permitted until 2025-12-02 07:47:28.052464137 +0000 UTC m=+95.836542999 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs") pod "network-metrics-daemon-8lqps" (UID: "b30f2d1f-53a1-4e87-819d-1e20bf3ed92a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.089292 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.089340 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.089349 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.089362 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.089373 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:56Z","lastTransitionTime":"2025-12-02T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.192038 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.192102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.192113 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.192135 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.192147 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:56Z","lastTransitionTime":"2025-12-02T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.295865 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.295920 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.295931 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.295948 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.295957 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:56Z","lastTransitionTime":"2025-12-02T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.399898 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.399937 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.399948 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.399962 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.399972 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:56Z","lastTransitionTime":"2025-12-02T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.502636 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.502719 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.502737 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.502803 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.502826 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:56Z","lastTransitionTime":"2025-12-02T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.560626 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.560730 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.560811 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:56 crc kubenswrapper[4691]: E1202 07:46:56.560876 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:56 crc kubenswrapper[4691]: E1202 07:46:56.561009 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:56 crc kubenswrapper[4691]: E1202 07:46:56.561148 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.606101 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.606153 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.606165 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.606185 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.606196 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:56Z","lastTransitionTime":"2025-12-02T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.708640 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.708679 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.708689 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.708703 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.708712 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:56Z","lastTransitionTime":"2025-12-02T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.811144 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.811190 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.811201 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.811216 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.811229 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:56Z","lastTransitionTime":"2025-12-02T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.912985 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.913013 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.913022 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.913034 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.913044 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:56Z","lastTransitionTime":"2025-12-02T07:46:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.934340 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gcsh_eb6171dd-c2ea-4c52-b906-e8a9a7ff6537/kube-multus/0.log" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.934388 4691 generic.go:334] "Generic (PLEG): container finished" podID="eb6171dd-c2ea-4c52-b906-e8a9a7ff6537" containerID="73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82" exitCode=1 Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.934417 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gcsh" event={"ID":"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537","Type":"ContainerDied","Data":"73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82"} Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.934903 4691 scope.go:117] "RemoveContainer" containerID="73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.945957 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1b45c56-bd4e-4cb9-bef7-55abe7ddef5f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001ae01619f89617dcf8704bacccb92d0c42ea23b8e935b170139d7a204a0ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3496e0007711c8e371e0f7fdcc4907faab2b72487290947fa2060054478a7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://830344d995bf21e3650a9d520bea20b313abffebd4891ff834259c7f3fa6f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:56Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.957112 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:56Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.970267 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:56Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:56 crc kubenswrapper[4691]: I1202 07:46:56.990711 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:36Z\\\",\\\"message\\\":\\\"76644c-zszqm\\\\nI1202 07:46:36.423012 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1202 07:46:36.423058 6348 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-6gcsh after 0 failed attempt(s)\\\\nI1202 07:46:36.423074 6348 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-6gcsh\\\\nI1202 07:46:36.423073 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1202 07:46:36.423082 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1202 07:46:36.423086 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certifica\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:56Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.005303 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.014911 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.014950 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.014961 4691 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.014976 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.014985 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:57Z","lastTransitionTime":"2025-12-02T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.016689 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:56Z\\\",\\\"message\\\":\\\"2025-12-02T07:46:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369\\\\n2025-12-02T07:46:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369 to /host/opt/cni/bin/\\\\n2025-12-02T07:46:11Z [verbose] multus-daemon started\\\\n2025-12-02T07:46:11Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:46:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.028922 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 
07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.042047 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.053793 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.064889 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.078665 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.087423 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 
07:46:57.097061 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.117109 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.120321 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.120357 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.120366 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.120383 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.120392 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:57Z","lastTransitionTime":"2025-12-02T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.150367 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.174502 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.185422 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.195729 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.223062 4691 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.223093 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.223103 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.223116 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.223125 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:57Z","lastTransitionTime":"2025-12-02T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.325057 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.325095 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.325105 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.325120 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.325129 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:57Z","lastTransitionTime":"2025-12-02T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.427647 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.427694 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.427708 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.427730 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.427743 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:57Z","lastTransitionTime":"2025-12-02T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.529341 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.529397 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.529417 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.529441 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.529458 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:57Z","lastTransitionTime":"2025-12-02T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.561413 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:57 crc kubenswrapper[4691]: E1202 07:46:57.561603 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.631728 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.631782 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.631794 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.631809 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.631820 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:57Z","lastTransitionTime":"2025-12-02T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.734407 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.734463 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.734478 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.734494 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.734504 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:57Z","lastTransitionTime":"2025-12-02T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.837133 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.837178 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.837189 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.837205 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.837216 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:57Z","lastTransitionTime":"2025-12-02T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.939976 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gcsh_eb6171dd-c2ea-4c52-b906-e8a9a7ff6537/kube-multus/0.log" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.940025 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gcsh" event={"ID":"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537","Type":"ContainerStarted","Data":"0b18518bf13d33754e2a8f6985e2da3df9cab1fc54a2e38240a65685c3fb2722"} Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.941139 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.941333 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.941370 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.941403 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.941429 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:57Z","lastTransitionTime":"2025-12-02T07:46:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.959948 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.978223 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:57 crc kubenswrapper[4691]: I1202 07:46:57.990516 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.000069 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:57Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.010311 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.018859 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.027554 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.037524 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1b45c56-bd4e-4cb9-bef7-55abe7ddef5f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001ae01619f89617dcf8704bacccb92d0c42ea23b8e935b170139d7a204a0ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3496e0007711c8e371e0f7fdcc4907faab2b72487290947fa2060054478a7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://830344d995bf21e3650a9d520bea20b313abffebd4891ff834259c7f3fa6f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.043722 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.043792 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.043805 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.043825 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.043837 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:58Z","lastTransitionTime":"2025-12-02T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.050808 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.065058 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.080942 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPa
th\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:36Z\\\",\\\"message\\\":\\\"76644c-zszqm\\\\nI1202 07:46:36.423012 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1202 07:46:36.423058 6348 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-6gcsh after 0 failed attempt(s)\\\\nI1202 07:46:36.423074 6348 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-6gcsh\\\\nI1202 07:46:36.423073 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1202 07:46:36.423082 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1202 07:46:36.423086 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: 
failed to verify certifica\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly
\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.092584 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.103893 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b18518bf13d33754e2a8f6985e2da3df9cab1fc54a2e38240a65685c3fb2722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:56Z\\\",\\\"message\\\":\\\"2025-12-02T07:46:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369\\\\n2025-12-02T07:46:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369 to /host/opt/cni/bin/\\\\n2025-12-02T07:46:11Z [verbose] multus-daemon started\\\\n2025-12-02T07:46:11Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:46:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.115212 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.126250 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.138242 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.147214 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.147259 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.147269 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.147285 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.147297 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:58Z","lastTransitionTime":"2025-12-02T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.156496 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.166099 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-02T07:46:58Z is after 2025-08-24T17:21:41Z" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.249867 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.249917 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.249927 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.249944 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.249958 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:58Z","lastTransitionTime":"2025-12-02T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.352035 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.352067 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.352077 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.352092 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.352100 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:58Z","lastTransitionTime":"2025-12-02T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
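
Note: the two "Failed to update status for pod" entries above share one root cause: the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 is serving a certificate whose validity ended 2025-08-24T17:21:41Z while the node clock reads 2025-12-02, so every status patch routed through it fails TLS verification. A minimal Go sketch for confirming this from the node (only the address comes from the log; the program is illustrative and not part of any cluster tooling):

    package main

    import (
        "crypto/tls"
        "fmt"
        "time"
    )

    func main() {
        // Fetch the serving chain without verifying it, since verification
        // is exactly what fails in the kubelet log.
        conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
        if err != nil {
            fmt.Println("dial:", err)
            return
        }
        defer conn.Close()
        now := time.Now()
        for _, cert := range conn.ConnectionState().PeerCertificates {
            fmt.Printf("subject=%v notAfter=%s expired=%v\n",
                cert.Subject,
                cert.NotAfter.Format(time.RFC3339),
                now.After(cert.NotAfter))
        }
    }

Against the endpoint above this should report expired=true with notAfter matching the 2025-08-24T17:21:41Z in the error text.
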
Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.453813 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.453856 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.453872 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.453893 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.453903 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:58Z","lastTransitionTime":"2025-12-02T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.556343 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.556378 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.556389 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.556421 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.556430 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:58Z","lastTransitionTime":"2025-12-02T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.560584 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.560655 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.560600 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:46:58 crc kubenswrapper[4691]: E1202 07:46:58.560695 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:46:58 crc kubenswrapper[4691]: E1202 07:46:58.560776 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:46:58 crc kubenswrapper[4691]: E1202 07:46:58.560833 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.658906 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.658950 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.658962 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.658979 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.658989 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:58Z","lastTransitionTime":"2025-12-02T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.761779 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.761809 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.761820 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.761835 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.761847 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:58Z","lastTransitionTime":"2025-12-02T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
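
Note: the NotReady loop and the "Error syncing pod" entries above all reduce to the same condition: CRI-O finds no CNI network definition, so the kubelet reports NetworkReady=false until the network operator (here multus/ovn-kubernetes) writes one into /etc/kubernetes/cni/net.d/. A self-contained Go sketch of that check (the directory is the one named in the error; treating *.conf, *.conflist and *.json as the accepted extensions follows libcni convention and is an assumption here):

    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        dir := "/etc/kubernetes/cni/net.d" // directory named in the kubelet error
        entries, err := os.ReadDir(dir)
        if err != nil {
            fmt.Println("readdir:", err)
            return
        }
        found := false
        for _, e := range entries {
            // libcni conventionally loads *.conf, *.conflist and *.json files.
            switch filepath.Ext(e.Name()) {
            case ".conf", ".conflist", ".json":
                fmt.Println("CNI config:", filepath.Join(dir, e.Name()))
                found = true
            }
        }
        if !found {
            fmt.Println("no CNI configuration file in", dir)
        }
    }

Until a file shows up there, sandbox creation for the network-check-* and networking-console-plugin pods keeps being skipped, which is what the pod_workers errors record.
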
Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.864802 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.864855 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.864878 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.864906 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.864924 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:58Z","lastTransitionTime":"2025-12-02T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.967195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.967233 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.967246 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.967265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:58 crc kubenswrapper[4691]: I1202 07:46:58.967276 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:58Z","lastTransitionTime":"2025-12-02T07:46:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.069157 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.069195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.069207 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.069225 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.069237 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:59Z","lastTransitionTime":"2025-12-02T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.171699 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.171744 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.171778 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.171798 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.171809 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:59Z","lastTransitionTime":"2025-12-02T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.274492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.274537 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.274546 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.274562 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.274573 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:59Z","lastTransitionTime":"2025-12-02T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.376475 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.376514 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.376526 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.376543 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.376555 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:59Z","lastTransitionTime":"2025-12-02T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.482361 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.482410 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.482421 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.482439 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.482453 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:59Z","lastTransitionTime":"2025-12-02T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.561011 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:46:59 crc kubenswrapper[4691]: E1202 07:46:59.561174 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.584879 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.584937 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.584948 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.584968 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.584980 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:59Z","lastTransitionTime":"2025-12-02T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
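
Note: the condition={...} payload printed with every "Node became not ready" line is plain JSON in the shape of a core/v1 NodeCondition, so the loop above can be post-processed straight out of the journal. A small Go sketch that decodes the condition from the entry immediately above (the literal is copied from the log; the struct is a hand-written mirror of the logged keys, not an import of the Kubernetes types):

    package main

    import (
        "encoding/json"
        "fmt"
    )

    // Mirrors the keys of the condition={...} payload in the log.
    type NodeCondition struct {
        Type               string `json:"type"`
        Status             string `json:"status"`
        LastHeartbeatTime  string `json:"lastHeartbeatTime"`
        LastTransitionTime string `json:"lastTransitionTime"`
        Reason             string `json:"reason"`
        Message            string `json:"message"`
    }

    func main() {
        raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:59Z","lastTransitionTime":"2025-12-02T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
        var c NodeCondition
        if err := json.Unmarshal([]byte(raw), &c); err != nil {
            fmt.Println("unmarshal:", err)
            return
        }
        fmt.Printf("%s=%s since %s (%s)\n", c.Type, c.Status, c.LastTransitionTime, c.Reason)
    }
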
Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.687613 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.687660 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.687673 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.687690 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.687703 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:59Z","lastTransitionTime":"2025-12-02T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.789658 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.789698 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.789708 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.789723 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.789733 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:59Z","lastTransitionTime":"2025-12-02T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.891856 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.891898 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.891908 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.891923 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.891934 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:59Z","lastTransitionTime":"2025-12-02T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.994097 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.994146 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.994160 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.994176 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:46:59 crc kubenswrapper[4691]: I1202 07:46:59.994190 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:46:59Z","lastTransitionTime":"2025-12-02T07:46:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.097237 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.097329 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.097341 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.097359 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.097371 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:00Z","lastTransitionTime":"2025-12-02T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.199855 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.199915 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.199933 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.199955 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.199971 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:00Z","lastTransitionTime":"2025-12-02T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.302398 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.302423 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.302432 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.302446 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.302454 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:00Z","lastTransitionTime":"2025-12-02T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.404730 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.404777 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.404787 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.404801 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.404810 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:00Z","lastTransitionTime":"2025-12-02T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.507624 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.507669 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.507680 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.507698 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.507711 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:00Z","lastTransitionTime":"2025-12-02T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.561044 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.561087 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:00 crc kubenswrapper[4691]: E1202 07:47:00.561216 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.561447 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:00 crc kubenswrapper[4691]: E1202 07:47:00.561544 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:00 crc kubenswrapper[4691]: E1202 07:47:00.561828 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.610230 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.610270 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.610280 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.610295 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.610303 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:00Z","lastTransitionTime":"2025-12-02T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.712700 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.712746 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.712775 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.712794 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.712808 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:00Z","lastTransitionTime":"2025-12-02T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.816079 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.816143 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.816161 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.816185 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.816202 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:00Z","lastTransitionTime":"2025-12-02T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.921588 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.921658 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.921671 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.921691 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:00 crc kubenswrapper[4691]: I1202 07:47:00.921704 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:00Z","lastTransitionTime":"2025-12-02T07:47:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.023849 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.023882 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.023892 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.023905 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.023914 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:01Z","lastTransitionTime":"2025-12-02T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.126266 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.126300 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.126313 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.126328 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.126338 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:01Z","lastTransitionTime":"2025-12-02T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.228569 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.228596 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.228604 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.228617 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.228625 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:01Z","lastTransitionTime":"2025-12-02T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.331412 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.331438 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.331446 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.331460 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.331471 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:01Z","lastTransitionTime":"2025-12-02T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.434067 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.434116 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.434125 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.434141 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.434149 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:01Z","lastTransitionTime":"2025-12-02T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.535884 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.535921 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.535932 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.535947 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.535957 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:01Z","lastTransitionTime":"2025-12-02T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.561007 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:01 crc kubenswrapper[4691]: E1202 07:47:01.561113 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.572875 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.638834 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.638873 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.638882 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.638896 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.638905 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:01Z","lastTransitionTime":"2025-12-02T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.741356 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.741394 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.741403 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.741418 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.741428 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:01Z","lastTransitionTime":"2025-12-02T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.843822 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.843867 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.843880 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.843899 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.843932 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:01Z","lastTransitionTime":"2025-12-02T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.946556 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.946625 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.946639 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.946657 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:01 crc kubenswrapper[4691]: I1202 07:47:01.946669 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:01Z","lastTransitionTime":"2025-12-02T07:47:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.048591 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.048638 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.048650 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.048667 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.048678 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.150534 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.150571 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.150581 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.150618 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.150669 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.252548 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.252586 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.252595 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.252609 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.252621 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.296855 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.296892 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.296904 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.296922 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.296934 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
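
Note: the entry that follows is the kubelet's periodic node-status update failing and being retried; the logged patch is a strategic-merge patch ($setElementOrder/conditions preserves the ordering of the conditions array) that restates the conditions just recorded plus the node's resource view and image list. In that view, allocatable is capacity minus the kubelet's reservations (system-reserved, kube-reserved and eviction thresholds), which can be read off the logged quantities; a trivial Go sketch doing just that arithmetic:

    package main

    import "fmt"

    func main() {
        // Quantities from the node-status patch below; capacity.cpu "12" is 12000m.
        capMemKi, allocMemKi := 32865360, 32404560
        capCPUm, allocCPUm := 12000, 11800
        fmt.Printf("reserved memory: %d Ki (~%d MiB)\n",
            capMemKi-allocMemKi, (capMemKi-allocMemKi)/1024)
        fmt.Printf("reserved cpu: %dm\n", capCPUm-allocCPUm)
    }

On the logged numbers that is 460800 Ki (450 MiB) of memory and 200m of CPU withheld from scheduling.
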
Dec 02 07:47:02 crc kubenswrapper[4691]: E1202 07:47:02.308616 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.311996 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.312047 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.312057 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.312071 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.312087 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: E1202 07:47:02.327902 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.331055 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.331088 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.331098 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.331114 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.331124 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: E1202 07:47:02.343503 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.347371 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.347408 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.347419 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.347434 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.347463 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: E1202 07:47:02.361443 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.364755 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.364809 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.364828 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.364847 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.364860 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: E1202 07:47:02.381793 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: E1202 07:47:02.381917 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.383789 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.383824 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.383834 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.383848 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.383859 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.486674 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.486713 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.486724 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.486740 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.486751 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.561543 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.561613 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.561539 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:02 crc kubenswrapper[4691]: E1202 07:47:02.561679 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:02 crc kubenswrapper[4691]: E1202 07:47:02.561781 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:02 crc kubenswrapper[4691]: E1202 07:47:02.561840 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.577969 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 
07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.589642 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d4eb39-936d-4598-95c9-52c800fefc1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7970d6804ddbc7e241b8f7d14ac1508db5cc75ba5ff7654a8d5f378c16e498ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da31937f137f04fe5d98563a438a55fe2799a0312ac273f04fbd2627181297bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da31937f137f04fe5d98563a438a55fe2799a0312ac273f04fbd2627181297bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.589744 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.589800 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.589811 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.589829 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.589839 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.601247 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.612820 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.626563 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":
\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.637401 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.647021 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.665915 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.677407 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.687503 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.691971 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.692010 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.692019 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.692036 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.692045 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.696723 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc 
kubenswrapper[4691]: I1202 07:47:02.706931 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate 
has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.715985 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.725734 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1b45c56-bd4e-4cb9-bef7-55abe7ddef5f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001ae01619f89617dcf8704bacccb92d0c42ea23b8e935b170139d7a204a0ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3496e0007711c8e371e0f7fdcc4907faab2b72487290947fa2060054478a7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://830344d995bf21e3650a9d520bea20b313abffebd4891ff834259c7f3fa6f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.738035 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.749598 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.766310 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5
ebe57b795076d8731dd5bfe3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:36Z\\\",\\\"message\\\":\\\"76644c-zszqm\\\\nI1202 07:46:36.423012 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1202 07:46:36.423058 6348 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-6gcsh after 0 failed attempt(s)\\\\nI1202 07:46:36.423074 6348 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-6gcsh\\\\nI1202 07:46:36.423073 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1202 07:46:36.423082 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1202 07:46:36.423086 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifica\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.778850 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.789882 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b18518bf13d33754e2a8f6985e2da3df9cab1fc54a2e38240a65685c3fb2722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:56Z\\\",\\\"message\\\":\\\"2025-12-02T07:46:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369\\\\n2025-12-02T07:46:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369 to /host/opt/cni/bin/\\\\n2025-12-02T07:46:11Z [verbose] multus-daemon started\\\\n2025-12-02T07:46:11Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:46:56Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:02Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.794394 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.794525 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.794621 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.794707 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.794801 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.896804 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.896846 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.896858 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.896877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.896888 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.998506 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.998819 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.998927 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.999032 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:02 crc kubenswrapper[4691]: I1202 07:47:02.999131 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:02Z","lastTransitionTime":"2025-12-02T07:47:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.101683 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.101803 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.101867 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.101890 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.101913 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:03Z","lastTransitionTime":"2025-12-02T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.204684 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.204753 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.204778 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.204795 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.204807 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:03Z","lastTransitionTime":"2025-12-02T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.306627 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.306668 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.306685 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.306701 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.306711 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:03Z","lastTransitionTime":"2025-12-02T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.409354 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.409407 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.409419 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.409437 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.409450 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:03Z","lastTransitionTime":"2025-12-02T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.511683 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.511729 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.511741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.511770 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.511779 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:03Z","lastTransitionTime":"2025-12-02T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.560708 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:03 crc kubenswrapper[4691]: E1202 07:47:03.560880 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.613669 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.613713 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.613725 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.613742 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.613752 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:03Z","lastTransitionTime":"2025-12-02T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.716240 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.716279 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.716287 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.716304 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.716313 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:03Z","lastTransitionTime":"2025-12-02T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.818012 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.818054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.818065 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.818082 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.818093 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:03Z","lastTransitionTime":"2025-12-02T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.919824 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.919868 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.919881 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.919898 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:03 crc kubenswrapper[4691]: I1202 07:47:03.919910 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:03Z","lastTransitionTime":"2025-12-02T07:47:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.022258 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.022303 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.022315 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.022335 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.022350 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:04Z","lastTransitionTime":"2025-12-02T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.124808 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.124842 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.124853 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.124868 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.124878 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:04Z","lastTransitionTime":"2025-12-02T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.227265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.227309 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.227336 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.227351 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.227360 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:04Z","lastTransitionTime":"2025-12-02T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.329295 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.329338 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.329353 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.329370 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.329382 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:04Z","lastTransitionTime":"2025-12-02T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.431659 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.431690 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.431699 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.431712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.431724 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:04Z","lastTransitionTime":"2025-12-02T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.534851 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.534898 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.534911 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.534927 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.534940 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:04Z","lastTransitionTime":"2025-12-02T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.561676 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.561857 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.562706 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:04 crc kubenswrapper[4691]: E1202 07:47:04.562920 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
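
The "No sandbox for pod can be found" entries and the "Error syncing pod, skipping" error above show the kubelet declining to create pod sandboxes while the runtime reports NetworkReady=false; the cause it cites each time is an empty CNI configuration directory. The following minimal Go sketch approximates the directory check behind that message, in the spirit of what the CNI config loader (libcni) does when it scans the conf dir; it is an illustration of the failing precondition, not the actual kubelet or CRI-O code path:

```go
// cnicheck: approximate the lookup behind the
// "no CNI configuration file in /etc/kubernetes/cni/net.d/" error.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	confDir := "/etc/kubernetes/cni/net.d" // directory named in the kubelet error
	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Printf("cannot read %s: %v\n", confDir, err)
		os.Exit(1)
	}
	var found []string
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json": // extensions libcni considers
			found = append(found, e.Name())
		}
	}
	if len(found) == 0 {
		// The state this log shows: the network plugin has not written its
		// config yet, so the runtime keeps reporting NetworkReady=false.
		fmt.Println("no CNI configuration file found; network plugin not ready")
		return
	}
	fmt.Println("CNI configs:", found)
}
```

Until the network plugin (here, ovnkube) drops a config file into that directory, every pod that needs cluster networking is skipped on each sync attempt, which is why the same pods reappear throughout this log.
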
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:04 crc kubenswrapper[4691]: E1202 07:47:04.563052 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:04 crc kubenswrapper[4691]: E1202 07:47:04.563176 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.563505 4691 scope.go:117] "RemoveContainer" containerID="a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.636699 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.636743 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.636772 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.636794 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.636807 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:04Z","lastTransitionTime":"2025-12-02T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.739388 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.739412 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.739419 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.739432 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.739440 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:04Z","lastTransitionTime":"2025-12-02T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.841816 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.841850 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.841863 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.841880 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.841893 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:04Z","lastTransitionTime":"2025-12-02T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.943751 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.943847 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.943860 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.943877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:04 crc kubenswrapper[4691]: I1202 07:47:04.943889 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:04Z","lastTransitionTime":"2025-12-02T07:47:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.046161 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.046212 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.046230 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.046249 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.046261 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:05Z","lastTransitionTime":"2025-12-02T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.148656 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.148687 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.148695 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.148709 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.148718 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:05Z","lastTransitionTime":"2025-12-02T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.250952 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.251002 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.251013 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.251029 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.251057 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:05Z","lastTransitionTime":"2025-12-02T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.354538 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.354590 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.354602 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.354619 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.354631 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:05Z","lastTransitionTime":"2025-12-02T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.457188 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.457226 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.457237 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.457252 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.457261 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:05Z","lastTransitionTime":"2025-12-02T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.559683 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.559752 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.559784 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.559804 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.559817 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:05Z","lastTransitionTime":"2025-12-02T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.560955 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:05 crc kubenswrapper[4691]: E1202 07:47:05.561135 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.661996 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.662041 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.662054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.662071 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.662084 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:05Z","lastTransitionTime":"2025-12-02T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.764514 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.764567 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.764579 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.764597 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.764611 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:05Z","lastTransitionTime":"2025-12-02T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
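
Each setters.go:603 entry above embeds the node's Ready condition as a JSON object, so the NotReady state can be inspected mechanically rather than by eye. Below is a short sketch that unmarshals one of the condition objects quoted in this log and classifies it; the struct is ad hoc for illustration, not a kubelet type:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
	"time"
)

// nodeCondition mirrors the fields visible in the setters.go:603 entries.
type nodeCondition struct {
	Type               string    `json:"type"`
	Status             string    `json:"status"`
	LastHeartbeatTime  time.Time `json:"lastHeartbeatTime"`
	LastTransitionTime time.Time `json:"lastTransitionTime"`
	Reason             string    `json:"reason"`
	Message            string    `json:"message"`
}

func main() {
	// Condition JSON copied from one of the log entries above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:05Z","lastTransitionTime":"2025-12-02T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	// Distinguish "NotReady because CNI config is missing" from other causes.
	cniMissing := c.Reason == "KubeletNotReady" &&
		strings.Contains(c.Message, "no CNI configuration file")
	fmt.Printf("Ready=%s reason=%s cniMissing=%v since=%s\n",
		c.Status, c.Reason, cniMissing, c.LastTransitionTime.Format(time.RFC3339))
}
```

Note that lastTransitionTime advances with every entry above, suggesting the kubelet never manages to persist the condition; the webhook failures that follow show why.
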
Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.866630 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.866686 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.866699 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.866716 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.866728 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:05Z","lastTransitionTime":"2025-12-02T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.962384 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/2.log" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.964536 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4"} Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.964926 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.968282 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.968314 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.968346 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.968365 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.968377 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:05Z","lastTransitionTime":"2025-12-02T07:47:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
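
The status_manager.go:875 entries that follow show a second, independent failure: every pod status patch is rejected because the "pod.network-node-identity.openshift.io" webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, long before the node's current time of 2025-12-02. The x509 error is the standard validity-window check; the sketch below reproduces it against a PEM file (the path is hypothetical, the real certificate lives wherever the webhook's serving secret is mounted):

```go
package main

import (
	"crypto/x509"
	"encoding/pem"
	"fmt"
	"os"
	"time"
)

func main() {
	// Hypothetical location; substitute the webhook's actual serving cert.
	data, err := os.ReadFile("/etc/webhook/tls.crt")
	if err != nil {
		fmt.Println("read cert:", err)
		os.Exit(1)
	}
	block, _ := pem.Decode(data)
	if block == nil {
		fmt.Println("no PEM block found")
		os.Exit(1)
	}
	cert, err := x509.ParseCertificate(block.Bytes)
	if err != nil {
		fmt.Println("parse cert:", err)
		os.Exit(1)
	}
	now := time.Now()
	// The same window check that yields "x509: certificate has expired or
	// is not yet valid: current time ... is after ..." in the entries below.
	switch {
	case now.After(cert.NotAfter):
		fmt.Printf("expired: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	case now.Before(cert.NotBefore):
		fmt.Printf("not yet valid: current time %s is before %s\n",
			now.Format(time.RFC3339), cert.NotBefore.Format(time.RFC3339))
	default:
		fmt.Printf("valid until %s\n", cert.NotAfter.Format(time.RFC3339))
	}
}
```

This pattern is common on a CRC cluster that has been left stopped past a certificate rotation deadline: until the webhook's certificate is reissued, none of the kubelet's status patches can land, so the stale pod statuses below keep being retried.
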
Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.975193 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:05Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.983316 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:05Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:05 crc kubenswrapper[4691]: I1202 07:47:05.990633 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:05Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.000905 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:05Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.018664 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.034811 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.047248 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.068210 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:36Z\\\",\\\"message\\\":\\\"76644c-zszqm\\\\nI1202 07:46:36.423012 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1202 07:46:36.423058 6348 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-6gcsh after 0 failed attempt(s)\\\\nI1202 07:46:36.423074 6348 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-6gcsh\\\\nI1202 07:46:36.423073 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1202 07:46:36.423082 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1202 07:46:36.423086 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certifica\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:47:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.070261 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.070315 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.070332 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.070354 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.070366 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:06Z","lastTransitionTime":"2025-12-02T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.083093 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1b45c56-bd4e-4cb9-bef7-55abe7ddef5f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001ae01619f89617dcf8704bacccb92d0c42ea23b8e935b170139d7a204a0ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3496e0007711c8e371e0f7fdcc4907faab2b72487290947fa2060054478a7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://830344d995bf21e3650a9d520bea20b313abffebd4891ff834259c7f3fa6f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.098728 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.112931 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.127821 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.142372 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b18518bf13d33754e2a8f6985e2da3df9cab1fc54a2e38240a65685c3fb2722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:56Z\\\",\\\"message\\\":\\\"2025-12-02T07:46:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369\\\\n2025-12-02T07:46:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369 to /host/opt/cni/bin/\\\\n2025-12-02T07:46:11Z [verbose] multus-daemon started\\\\n2025-12-02T07:46:11Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:46:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.154140 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.170998 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.172421 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.172447 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.172457 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.172470 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.172479 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:06Z","lastTransitionTime":"2025-12-02T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.186050 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.200480 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\
\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.211859 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d4eb39-936d-4598-95c9-52c800fefc1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7970d6804ddbc7e241b8f7d14ac1508db5cc75ba5ff7654a8d5f378c16e498ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da31937f137f04fe5d98563a438a55fe2799a0312ac273f04fbd2627181297bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da31937f137f04fe5d98563a438a55fe2799a0312ac273f04fbd2627181297bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.225716 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.275345 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.275429 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.275445 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.275465 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.275476 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:06Z","lastTransitionTime":"2025-12-02T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.378593 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.378652 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.378668 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.378692 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.378706 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:06Z","lastTransitionTime":"2025-12-02T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.481174 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.481216 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.481226 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.481242 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.481251 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:06Z","lastTransitionTime":"2025-12-02T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.561060 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:06 crc kubenswrapper[4691]: E1202 07:47:06.561191 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.561307 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.561433 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:06 crc kubenswrapper[4691]: E1202 07:47:06.561470 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:06 crc kubenswrapper[4691]: E1202 07:47:06.561650 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.584329 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.584394 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.584407 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.584427 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.584441 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:06Z","lastTransitionTime":"2025-12-02T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.686733 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.686780 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.686790 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.686804 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.686813 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:06Z","lastTransitionTime":"2025-12-02T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.788794 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.788850 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.788869 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.788890 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.788907 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:06Z","lastTransitionTime":"2025-12-02T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.892970 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.893024 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.893038 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.893058 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.893077 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:06Z","lastTransitionTime":"2025-12-02T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.968906 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/3.log" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.969605 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/2.log" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.972370 4691 generic.go:334] "Generic (PLEG): container finished" podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4" exitCode=1 Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.972408 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4"} Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.972443 4691 scope.go:117] "RemoveContainer" containerID="a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.974130 4691 scope.go:117] "RemoveContainer" containerID="4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4" Dec 02 07:47:06 crc kubenswrapper[4691]: E1202 07:47:06.974369 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.989222 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:06Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.995226 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.995269 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.995278 4691 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.995292 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:06 crc kubenswrapper[4691]: I1202 07:47:06.995304 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:06Z","lastTransitionTime":"2025-12-02T07:47:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.007402 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b18518bf13d33754e2a8f6985e2da3df9cab1fc54a2e38240a65685c3fb2722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:56Z\\\",\\\"message\\\":\\\"2025-12-02T07:46:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369\\\\n2025-12-02T07:46:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369 to /host/opt/cni/bin/\\\\n2025-12-02T07:46:11Z [verbose] multus-daemon started\\\\n2025-12-02T07:46:11Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:46:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.022874 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.034251 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d4eb39-936d-4598-95c9-52c800fefc1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7970d6804ddbc7e241b8f7d14ac1508db5cc75ba5ff7654a8d5f378c16e498ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da31937f137f04fe5d98563a438a55fe2799a0312ac273f04fbd2627181297bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da31937f137f04fe5d98563a438a55fe2799a0312ac273f04fbd2627181297bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.045882 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.057335 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.074982 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.085498 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.097283 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.097322 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.097332 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.097346 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.097357 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:07Z","lastTransitionTime":"2025-12-02T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.110852 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e5099284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.124134 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.134674 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.145270 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.157596 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.170258 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.186703 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.199315 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.199367 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.199576 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.199601 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.199618 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:07Z","lastTransitionTime":"2025-12-02T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.200593 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1b45c56-bd4e-4cb9-bef7-55abe7ddef5f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001ae01619f89617dcf8704bacccb92d0c42ea23b8e935b170139d7a204a0ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3496e0007711c8e371e0f7fdcc4907faab2b72487290947fa2060054478a7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://830344d995bf21e3650a9d520bea20b313abffebd4891ff834259c7f3fa6f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.213726 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.229809 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.254142 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a46ff24826aea6b1bdc5c958dfc2391cb0907cc5ebe57b795076d8731dd5bfe3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:36Z\\\",\\\"message\\\":\\\"76644c-zszqm\\\\nI1202 07:46:36.423012 6348 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI1202 07:46:36.423058 6348 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-6gcsh after 0 failed attempt(s)\\\\nI1202 07:46:36.423074 6348 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-6gcsh\\\\nI1202 07:46:36.423073 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI1202 07:46:36.423082 6348 obj_retry.go:303] Retry object setup: *v1.Pod openshift-etcd/etcd-crc\\\\nF1202 07:46:36.423086 6348 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certifica\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:35Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:47:06Z\\\",\\\"message\\\":\\\"s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 07:47:06.122332 6709 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 07:47:06.122340 6709 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for network=default: []services.LB{}\\\\nI1202 07:47:06.122353 6709 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1202 07:47:06.110584 6709 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 07:47:06.122656 6709 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:47:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.302225 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.302271 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.302281 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.302297 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.302307 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:07Z","lastTransitionTime":"2025-12-02T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.405257 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.405299 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.405307 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.405321 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.405332 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:07Z","lastTransitionTime":"2025-12-02T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.507808 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.507865 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.507879 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.507900 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.507915 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:07Z","lastTransitionTime":"2025-12-02T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.561292 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:07 crc kubenswrapper[4691]: E1202 07:47:07.561420 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.610047 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.610109 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.610123 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.610139 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.610150 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:07Z","lastTransitionTime":"2025-12-02T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.712409 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.712448 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.712460 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.712474 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.712483 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:07Z","lastTransitionTime":"2025-12-02T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.815149 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.815191 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.815220 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.815237 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.815246 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:07Z","lastTransitionTime":"2025-12-02T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.919218 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.919251 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.919260 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.919274 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.919283 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:07Z","lastTransitionTime":"2025-12-02T07:47:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.978720 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/3.log" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.982399 4691 scope.go:117] "RemoveContainer" containerID="4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4" Dec 02 07:47:07 crc kubenswrapper[4691]: E1202 07:47:07.982662 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" Dec 02 07:47:07 crc kubenswrapper[4691]: I1202 07:47:07.994952 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1b45c56-bd4e-4cb9-bef7-55abe7ddef5f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001ae01619f89617dcf8704bacccb92d0c42ea23b8e935b170139d7a204a0ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3496e0007711c8e371e0f7fdcc4907faab2b72487290947fa2060054478a7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://830344d995bf21e3650a9d520bea20b313abffebd4891ff834259c7f3fa6f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:07Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.006980 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.019163 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.023151 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.023195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.023284 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.023339 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.023353 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:08Z","lastTransitionTime":"2025-12-02T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.039130 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffee9841e8cff0302c2380bc8b935740d02e972
6316c66ed194bfe5c2de75d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:47:06Z\\\",\\\"message\\\":\\\"s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 07:47:06.122332 6709 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 07:47:06.122340 6709 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for network=default: []services.LB{}\\\\nI1202 07:47:06.122353 6709 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1202 07:47:06.110584 6709 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 07:47:06.122656 6709 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:47:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.051683 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.064033 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b18518bf13d33754e2a8f6985e2da3df9cab1fc54a2e38240a65685c3fb2722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:56Z\\\",\\\"message\\\":\\\"2025-12-02T07:46:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369\\\\n2025-12-02T07:46:11+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369 to /host/opt/cni/bin/\\\\n2025-12-02T07:46:11Z [verbose] multus-daemon started\\\\n2025-12-02T07:46:11Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:46:56Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.080987 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.093170 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 
07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.105556 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\
\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting 
RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.114483 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d4eb39-936d-4598-95c9-52c800fefc1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7970d6804ddbc7e241b8f7d14ac1508db5cc75ba5ff7654a8d5f378c16e498ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da31937f137f04fe5d98563a438a55fe2799a0312ac273f04fbd2627181297bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da31937f137f04fe5d98563a438a55fe2799a0312ac273f04fbd2627181297bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.125713 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.126212 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.126250 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.126261 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.126275 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.126286 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:08Z","lastTransitionTime":"2025-12-02T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.137579 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.147407 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.156999 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.167076 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.191563 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.202632 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.213967 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.222056 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:08Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.228666 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.228686 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.228694 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.228707 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.228717 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:08Z","lastTransitionTime":"2025-12-02T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.331551 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.331618 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.331639 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.331666 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.331689 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:08Z","lastTransitionTime":"2025-12-02T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.434839 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.434891 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.434913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.434943 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.434965 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:08Z","lastTransitionTime":"2025-12-02T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.537467 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.537716 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.537842 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.537981 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.538077 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:08Z","lastTransitionTime":"2025-12-02T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.561134 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.561195 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.561292 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:08 crc kubenswrapper[4691]: E1202 07:47:08.561282 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:08 crc kubenswrapper[4691]: E1202 07:47:08.561400 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:08 crc kubenswrapper[4691]: E1202 07:47:08.561505 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.640991 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.641035 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.641047 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.641064 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.641074 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:08Z","lastTransitionTime":"2025-12-02T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.743991 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.744234 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.744302 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.744369 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.744445 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:08Z","lastTransitionTime":"2025-12-02T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.847025 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.847068 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.847084 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.847101 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.847113 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:08Z","lastTransitionTime":"2025-12-02T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.949018 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.949066 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.949078 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.949096 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:08 crc kubenswrapper[4691]: I1202 07:47:08.949107 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:08Z","lastTransitionTime":"2025-12-02T07:47:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.051578 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.051645 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.051750 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.051889 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.051917 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:09Z","lastTransitionTime":"2025-12-02T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.155217 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.155268 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.155284 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.155315 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.155354 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:09Z","lastTransitionTime":"2025-12-02T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.257947 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.258015 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.258057 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.258090 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.258114 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:09Z","lastTransitionTime":"2025-12-02T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.360378 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.360451 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.360469 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.360492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.360508 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:09Z","lastTransitionTime":"2025-12-02T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.463178 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.463258 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.463280 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.463309 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.463330 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:09Z","lastTransitionTime":"2025-12-02T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.561586 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:09 crc kubenswrapper[4691]: E1202 07:47:09.561818 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.565389 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.565420 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.565431 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.565447 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.565461 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:09Z","lastTransitionTime":"2025-12-02T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.667884 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.667928 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.667937 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.667951 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.667961 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:09Z","lastTransitionTime":"2025-12-02T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.770944 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.771009 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.771033 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.771059 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.771214 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:09Z","lastTransitionTime":"2025-12-02T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.879847 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.879924 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.879950 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.879981 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.880003 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:09Z","lastTransitionTime":"2025-12-02T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.983031 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.983138 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.983172 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.983203 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:09 crc kubenswrapper[4691]: I1202 07:47:09.983225 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:09Z","lastTransitionTime":"2025-12-02T07:47:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.087078 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.087127 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.087139 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.087156 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.087167 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:10Z","lastTransitionTime":"2025-12-02T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.189839 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.189913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.189931 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.189957 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.189978 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:10Z","lastTransitionTime":"2025-12-02T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.292362 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.292403 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.292421 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.292440 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.292455 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:10Z","lastTransitionTime":"2025-12-02T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.395662 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.395745 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.395803 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.395841 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.396265 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:10Z","lastTransitionTime":"2025-12-02T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.499444 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.499500 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.499513 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.499536 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.499550 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:10Z","lastTransitionTime":"2025-12-02T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.560723 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.560816 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.560914 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:10 crc kubenswrapper[4691]: E1202 07:47:10.561001 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:10 crc kubenswrapper[4691]: E1202 07:47:10.561245 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:10 crc kubenswrapper[4691]: E1202 07:47:10.561287 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.602691 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.602742 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.602753 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.602793 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.602806 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:10Z","lastTransitionTime":"2025-12-02T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.705800 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.705859 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.705870 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.705894 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.705910 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:10Z","lastTransitionTime":"2025-12-02T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.809207 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.809256 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.809269 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.809291 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.809306 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:10Z","lastTransitionTime":"2025-12-02T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.913194 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.913274 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.913296 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.913327 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:10 crc kubenswrapper[4691]: I1202 07:47:10.913350 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:10Z","lastTransitionTime":"2025-12-02T07:47:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.016340 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.016396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.016411 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.016431 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.016453 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:11Z","lastTransitionTime":"2025-12-02T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.119993 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.120090 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.120115 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.120161 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.120187 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:11Z","lastTransitionTime":"2025-12-02T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.224748 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.224871 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.224899 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.224931 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.224951 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:11Z","lastTransitionTime":"2025-12-02T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.326888 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.326948 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.326966 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.326989 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.327004 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:11Z","lastTransitionTime":"2025-12-02T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.429934 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.430007 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.430017 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.430031 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.430040 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:11Z","lastTransitionTime":"2025-12-02T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.533041 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.533098 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.533111 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.533132 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.533148 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:11Z","lastTransitionTime":"2025-12-02T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.561156 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:11 crc kubenswrapper[4691]: E1202 07:47:11.561372 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.636589 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.636654 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.636666 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.636685 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.636696 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:11Z","lastTransitionTime":"2025-12-02T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.740059 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.740131 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.740152 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.740185 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.740210 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:11Z","lastTransitionTime":"2025-12-02T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.842582 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.842636 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.842646 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.842661 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.842670 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:11Z","lastTransitionTime":"2025-12-02T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.945803 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.946280 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.946372 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.946506 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:11 crc kubenswrapper[4691]: I1202 07:47:11.946641 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:11Z","lastTransitionTime":"2025-12-02T07:47:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.049810 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.049872 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.049885 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.049907 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.049920 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.152528 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.152581 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.152801 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.152819 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.152830 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.255337 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.255364 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.255373 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.255385 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.255394 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.357676 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.357719 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.357728 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.357741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.357749 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.462708 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.463013 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.463134 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.463178 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.463193 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.561582 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.561667 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:12 crc kubenswrapper[4691]: E1202 07:47:12.561818 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:12 crc kubenswrapper[4691]: E1202 07:47:12.561848 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.562208 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:12 crc kubenswrapper[4691]: E1202 07:47:12.562423 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.565724 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.565975 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.566110 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.566250 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.566792 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.569641 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.569725 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.569741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.569779 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.569795 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.574369 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"80d4eb39-936d-4598-95c9-52c800fefc1e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7970d6804ddbc7e241b8f7d14ac1508db5cc75ba5ff7654a8d5f378c16e498ff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da31937f137f04fe5d98563a438a55fe2799a0312ac273f04fbd2627181297bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da31937f137f04fe5d98563a438a55fe2799a0312ac273f04fbd2627181297bd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: E1202 07:47:12.584144 4691 
kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/red
hat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987
117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba
717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.586115 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.593011 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.593195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.593286 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.593351 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.593417 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.596916 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: E1202 07:47:12.607131 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"5
94c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.611427 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.611468 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.611478 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.611496 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.611510 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.611818 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5ce5053b-1d3d-4bc9-9b65-a38112c18218\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f66711602ecd6c506d3099610055f588465f989c3c77dc86d7d03b6a93e2ee8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f47cd6e07bd854573a13d49fbab12a8b969bbf12c6aa5ab61835f8a262ff94e4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dd0261287443d6610a952bbf68dace5f9642269aa603c273a24a7137114b8c73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dcf8cb523637cafcdc3a0df7b237fac925b515e4189de632e2c1723f37eb82b9\\\",\\\"exitCode\\\":0,\\\"
finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6e9d2e1bedba315f62abee191f6dbb6ae9ce71c0cee1e23c8b38eb3384ce7859\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8f7b38916d95ea694bd866a9237e1b0342cb4e74e903d46d85fc8d0c96029e1b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e0d5a195bf0ccffb22e3e8ec3422e166a5959102ce56b656666f83fbb31c4d5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x8c92\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-rs5s2\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.623802 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fdc9815d-db42-4be2-b58e-4496dca655de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a95e2990740de41682df69c0b829b1d32e75343f7f7c6ea7104917bd81976708\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"contai
nerID\\\":\\\"cri-o://c09ed393ea7fd4242a497ec345352b392a15f98de97147e28308bbf000aecc65\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-wxfbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:21Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-zszqm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: E1202 07:47:12.623845 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.627387 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.627418 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.627428 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.627460 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.627470 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.638207 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfac83b6-bc8a-4d98-994c-4f982f3e0b4d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"mespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1202 07:46:09.966996 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1202 07:46:09.968018 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1084161854/tls.crt::/tmp/serving-cert-1084161854/tls.key\\\\\\\"\\\\nI1202 07:46:10.263427 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1202 07:46:10.265592 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1202 07:46:10.265611 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1202 07:46:10.265630 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1202 07:46:10.265645 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1202 07:46:10.270027 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1202 07:46:10.270046 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270051 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1202 07:46:10.270054 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1202 07:46:10.270058 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1202 07:46:10.270061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1202 07:46:10.270063 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI1202 07:46:10.270148 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF1202 07:46:10.272034 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\nI1202 07:46:10.272466 1 requestheader_controller.go:172] Starting RequestHeaderAuthRequestController\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: E1202 07:47:12.639730 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"594c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.643329 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.643362 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.643373 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.643387 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.643397 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.651192 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: E1202 07:47:12.658198 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-02T07:47:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d3569d90-2e1c-4c42-8376-d393c2ea01f6\\\",\\\"systemUUID\\\":\\\"5
94c318f-f07e-4852-8352-7483c8d3d991\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: E1202 07:47:12.658364 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.669820 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8891f6ef8e1184f52805dbb9bfc8c1ec037da50d0cb56b39cd89ba2c3b9d3fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.669896 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.669918 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.669926 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.669959 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc 
kubenswrapper[4691]: I1202 07:47:12.669968 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.680153 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-v26sg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"be5dfd05-e2d4-460a-93e2-5e138f0dc58c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://137273fc0485e08a7eea3156ed054cf97183ed056fd7ffdbab96d5fe24498f43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-hwn7b\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-v26sg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.690795 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"82103e10-1127-4a84-b5fc-9d0d6a259932\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c9eefe9aef820e6c52ff00b0fb531b0c7315e925f7a423674d364d2b6570a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-psldk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-mgbt6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.698807 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-rg26n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0707e5b7-0f43-45e6-a9c9-af60cbbe31de\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e33451b37676a7a316864f268a7b548808d177049879e00d6b0eeb96138f53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-strhx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:12Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-rg26n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.707288 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8lqps" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:24Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2bd6w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:24Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8lqps\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.725827 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2a5cf5ea-1c35-45c1-b544-78edd43d966b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0201cc4bab6a0c63d66982e430bf888410cf9b359536a08cb0c2b2ee8f59ac6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff0a914c40a91e6944329cd4771a004a1c0a0be8305f750f8a3f6c209e4d133b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://be0b906f14063dc63b51976b1c1e878186ca352827e928cdadfbcfe67fc8d19e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://583a285f96f4b6fb09b3db55b8e1431660b0e50
99284b4e8537e15d290f40819\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dcff29409815c92facc140ec531913e91b18fbb3eea2e9702ac844de7f467f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d3af3fa23235487f87967517ebede8d98bcbc2cdb733dca3dfb8c18c517b4c8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7813dc6b16645df8afa1a82f2871ae7be839c239fca7174f2b14a37d097374ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62f5462e14e7eb5e0173f21b3889b0cbcdf4e861c36af87f0075199c0cc00e94\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.736154 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dcc9d3d6-3429-47e6-881a-bf29db5eec30\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66354a0743bc597474c4fec93fbc366ab5ab9971235492ed7f27abb8cc9c5948\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53cd0cc1bbdad9d265d555e176a8424c58d23b9df345317f108a54533e634d82\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a43367c28afaa794d45d1c138be1fb9a53b19ddb9d68c149761d4f5ed37700ef\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.748302 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2f264d7a0cd2efc7cd4f5d15e319ecf35b47bda285fabf991e6d43b8d308fff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.763412 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3605748c-8980-4aa9-8d28-f18a17aa8124\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ffee9841e8cff0302c2380bc8b935740d02e972
6316c66ed194bfe5c2de75d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:47:06Z\\\",\\\"message\\\":\\\"s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI1202 07:47:06.122332 6709 services_controller.go:452] Built service openshift-kube-controller-manager/kube-controller-manager per-node LB for network=default: []services.LB{}\\\\nI1202 07:47:06.122340 6709 services_controller.go:453] Built service openshift-kube-controller-manager/kube-controller-manager template LB for network=default: []services.LB{}\\\\nI1202 07:47:06.122353 6709 services_controller.go:454] Service openshift-kube-controller-manager/kube-controller-manager for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1202 07:47:06.110584 6709 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1202 07:47:06.122656 6709 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:47:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8q22\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-7pgxh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.772583 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.772614 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.772623 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.772636 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.772646 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.775854 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a1b45c56-bd4e-4cb9-bef7-55abe7ddef5f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:45:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://001ae01619f89617dcf8704bacccb92d0c42ea23b8e935b170139d7a204a0ba9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3496e0007711c8e371e0f7fdcc4907faab2b72487290947fa2060054478a7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://830344d995bf21e3650a9d520bea20b313abffebd4891ff834259c7f3fa6f11d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:45:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"
cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://93657bd6734bf0f72904c07abc21c4f4a108c432689e7e71b4bfec9dc42c3201\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-02T07:45:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-02T07:45:53Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:45:52Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.787473 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6gcsh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0b18518bf13d33754e2a8f6985e2da3df9cab1fc54a2e38240a65685c3fb2722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-02T07:46:56Z\\\",\\\"message\\\":\\\"2025-12-02T07:46:11+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369\\\\n2025-12-02T07:46:11+00:00 [cnibincopy] Successfully moved files in 
/host/opt/cni/bin/upgrade_531ac2d0-0b6e-4176-993a-128ab44b0369 to /host/opt/cni/bin/\\\\n2025-12-02T07:46:11Z [verbose] multus-daemon started\\\\n2025-12-02T07:46:11Z [verbose] Readiness Indicator file check\\\\n2025-12-02T07:46:56Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-drlmn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-02T07:46:10Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6gcsh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.798080 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-02T07:46:11Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3479ac86544849c314bedb8693e51aee151ba4a954c17b723e8db9b531c706cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://49c7e11e591d4e0f1250c954e249b6ed4f5f70f9e0cd0c9cd80329c0d3d49083\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-02T07:46:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-02T07:47:12Z is after 2025-08-24T17:21:41Z" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.875342 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.875384 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.875396 4691 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.875410 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.875420 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.977670 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.977883 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.977974 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.978039 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:12 crc kubenswrapper[4691]: I1202 07:47:12.978124 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:12Z","lastTransitionTime":"2025-12-02T07:47:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.080677 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.080839 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.080854 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.080874 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.080888 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:13Z","lastTransitionTime":"2025-12-02T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.183619 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.183644 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.183653 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.183665 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.183673 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:13Z","lastTransitionTime":"2025-12-02T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.286423 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.286487 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.286497 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.286514 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.286522 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:13Z","lastTransitionTime":"2025-12-02T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.389309 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.389350 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.389358 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.389371 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.389379 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:13Z","lastTransitionTime":"2025-12-02T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.490995 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.491520 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.491530 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.491546 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.491556 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:13Z","lastTransitionTime":"2025-12-02T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.560864 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:13 crc kubenswrapper[4691]: E1202 07:47:13.561009 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.593741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.593803 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.593812 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.593825 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.593833 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:13Z","lastTransitionTime":"2025-12-02T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.695978 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.696681 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.696824 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.696922 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.696999 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:13Z","lastTransitionTime":"2025-12-02T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.799109 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.799190 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.799200 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.799215 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.799225 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:13Z","lastTransitionTime":"2025-12-02T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.901494 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.901520 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.901529 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.901543 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:13 crc kubenswrapper[4691]: I1202 07:47:13.901560 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:13Z","lastTransitionTime":"2025-12-02T07:47:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.003606 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.003875 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.003966 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.004114 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.004220 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:14Z","lastTransitionTime":"2025-12-02T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.107448 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.107484 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.107493 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.107508 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.107519 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:14Z","lastTransitionTime":"2025-12-02T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.210031 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.210511 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.210680 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.210905 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.211066 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:14Z","lastTransitionTime":"2025-12-02T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.314250 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.314491 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.314724 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.314842 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.315017 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:14Z","lastTransitionTime":"2025-12-02T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.418128 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.418393 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.418487 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.418576 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.418661 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:14Z","lastTransitionTime":"2025-12-02T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.437465 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.437603 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.437623 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.437646 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.437752 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.437782 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.437794 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.437834 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:18.437821827 +0000 UTC m=+146.221900689 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.437973 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:18.437963791 +0000 UTC m=+146.222042653 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.438013 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.438039 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:18.438033973 +0000 UTC m=+146.222112835 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.438297 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.438452 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:18.438434493 +0000 UTC m=+146.222513355 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
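
Each failed mount/unmount above is parked with a growing delay rather than retried immediately: the "durationBeforeRetry 1m4s" values are consistent with a delay that doubles on each consecutive failure of the same operation. A minimal sketch of that scheme in Go (the 500ms starting delay and ~2m cap are assumptions for illustration; only the 1m4s figure is taken from this log):

package main

import (
	"fmt"
	"time"
)

// Doubling backoff, as suggested by the "durationBeforeRetry 1m4s"
// entries above. The initial delay and the cap are assumed values,
// not read from the log.
func main() {
	delay := 500 * time.Millisecond            // assumed starting delay
	maxDelay := 2*time.Minute + 2*time.Second  // assumed cap
	for failure := 1; failure <= 9; failure++ {
		fmt.Printf("after failure %d: durationBeforeRetry %v\n", failure, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}

Under these assumptions, 1m4s (64s = 500ms * 2^7) would correspond to the eighth consecutive failure of the same volume operation.
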
Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.521265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.521318 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.521329 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.521346 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.521356 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:14Z","lastTransitionTime":"2025-12-02T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.539091 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.539372 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.539864 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.540103 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.540860 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:18.540817749 +0000 UTC m=+146.324896661 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.561031 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.561134 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.561213 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.561271 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.561335 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Dec 02 07:47:14 crc kubenswrapper[4691]: E1202 07:47:14.561423 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
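
The sandbox failures above all gate on one precondition: the runtime reports NetworkReady=false until a CNI configuration appears in /etc/kubernetes/cni/net.d/. A rough sketch of such a readiness check (the glob patterns are an assumption; the actual CRI-O/libcni logic is more involved):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// Check whether any CNI config exists in the directory named in the
// log. The extension patterns are assumed for illustration.
func main() {
	dir := "/etc/kubernetes/cni/net.d" // directory from the log messages
	var found []string
	for _, pat := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(dir, pat))
		if err == nil {
			found = append(found, matches...)
		}
	}
	if len(found) == 0 {
		fmt.Fprintf(os.Stderr, "no CNI configuration file in %s. Has your network provider started?\n", dir)
		os.Exit(1)
	}
	fmt.Println("CNI configuration present:", found)
}

Until the network operator writes such a file, every pod that needs a sandbox is skipped and re-queued, which is why the same three pods cycle through "Error syncing pod, skipping" below.
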
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.626890 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.626961 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.626985 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.627016 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.627037 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:14Z","lastTransitionTime":"2025-12-02T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.729040 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.729085 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.729102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.729118 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.729130 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:14Z","lastTransitionTime":"2025-12-02T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.831927 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.832458 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.832540 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.832886 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.832992 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:14Z","lastTransitionTime":"2025-12-02T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.935862 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.936116 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.936192 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.936265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:14 crc kubenswrapper[4691]: I1202 07:47:14.936345 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:14Z","lastTransitionTime":"2025-12-02T07:47:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.039026 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.039092 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.039105 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.039126 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.039141 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:15Z","lastTransitionTime":"2025-12-02T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.142246 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.142286 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.142300 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.142325 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.142348 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:15Z","lastTransitionTime":"2025-12-02T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.244877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.244938 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.244958 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.244978 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.244992 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:15Z","lastTransitionTime":"2025-12-02T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.347404 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.347435 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.347446 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.347458 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.347467 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:15Z","lastTransitionTime":"2025-12-02T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.449703 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.449746 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.449773 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.449789 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.449799 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:15Z","lastTransitionTime":"2025-12-02T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.551878 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.552210 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.552287 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.552360 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.552442 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:15Z","lastTransitionTime":"2025-12-02T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.561257 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps"
Dec 02 07:47:15 crc kubenswrapper[4691]: E1202 07:47:15.561510 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a"
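
The condition={...} payload that setters.go:603 prints on every status sync is plain JSON. A small, self-contained sketch that decodes one occurrence (field names are copied from the log itself; the struct here is illustrative, not the kubelet's own type):

package main

import (
	"encoding/json"
	"fmt"
)

// Illustrative struct mirroring the condition payload in the log.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// One condition payload, copied verbatim from the entries above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:15Z","lastTransitionTime":"2025-12-02T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("%s=%s (%s): %s\n", c.Type, c.Status, c.Reason, c.Message)
}

Note that lastHeartbeatTime advances with every entry while the node stays NotReady; only the Ready=True transition would change lastTransitionTime for good.
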
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.655142 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.655222 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.655240 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.655265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.655283 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:15Z","lastTransitionTime":"2025-12-02T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.757727 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.757804 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.757817 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.757838 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.757849 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:15Z","lastTransitionTime":"2025-12-02T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.860024 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.860060 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.860072 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.860086 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.860096 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:15Z","lastTransitionTime":"2025-12-02T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.962744 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.962801 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.962811 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.962827 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:15 crc kubenswrapper[4691]: I1202 07:47:15.962840 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:15Z","lastTransitionTime":"2025-12-02T07:47:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.065057 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.065114 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.065131 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.065153 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.065170 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:16Z","lastTransitionTime":"2025-12-02T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.167878 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.167935 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.167946 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.167963 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.167975 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:16Z","lastTransitionTime":"2025-12-02T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.270046 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.270091 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.270102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.270118 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.270130 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:16Z","lastTransitionTime":"2025-12-02T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.372223 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.372263 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.372274 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.372290 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.372300 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:16Z","lastTransitionTime":"2025-12-02T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.474179 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.474243 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.474256 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.474271 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.474304 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:16Z","lastTransitionTime":"2025-12-02T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.560532 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.560605 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.560548 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 02 07:47:16 crc kubenswrapper[4691]: E1202 07:47:16.560755 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Dec 02 07:47:16 crc kubenswrapper[4691]: E1202 07:47:16.560954 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Dec 02 07:47:16 crc kubenswrapper[4691]: E1202 07:47:16.561055 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
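
By this point the same five node-status lines recur roughly every 100ms, interleaved with the per-pod sync errors. A short sketch for condensing such a capture, counting how often each recurring message appears (reads the journal text on stdin; the substring keys are copied from the messages above):

package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

// Count occurrences of the messages that dominate this capture.
func main() {
	keys := []string{
		"NodeHasSufficientMemory",
		"NodeHasNoDiskPressure",
		"NodeHasSufficientPID",
		"NodeNotReady",
		"Node became not ready",
		"Error syncing pod, skipping",
		"No sandbox for pod can be found",
	}
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // entries can be very long
	for sc.Scan() {
		line := sc.Text()
		for _, key := range keys {
			if strings.Contains(line, key) {
				counts[key]++
			}
		}
	}
	for _, key := range keys { // iterate the slice for stable output order
		fmt.Printf("%6d  %s\n", counts[key], key)
	}
}

For example, feed it a saved copy of this capture: go run summarize.go < kubelet.log (the file name is hypothetical).
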
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.576433 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.576463 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.576473 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.576487 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.576498 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:16Z","lastTransitionTime":"2025-12-02T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.678877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.678904 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.678913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.678927 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.678935 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:16Z","lastTransitionTime":"2025-12-02T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.781879 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.781915 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.781926 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.781942 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.781955 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:16Z","lastTransitionTime":"2025-12-02T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.884533 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.884570 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.884579 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.884592 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.884601 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:16Z","lastTransitionTime":"2025-12-02T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.987266 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.987297 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.987306 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.987333 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:16 crc kubenswrapper[4691]: I1202 07:47:16.987341 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:16Z","lastTransitionTime":"2025-12-02T07:47:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.089655 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.089690 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.089699 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.089712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.089722 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:17Z","lastTransitionTime":"2025-12-02T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.192256 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.192470 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.192577 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.192668 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.192806 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:17Z","lastTransitionTime":"2025-12-02T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.294970 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.295328 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.295529 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.295742 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.295986 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:17Z","lastTransitionTime":"2025-12-02T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.398789 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.398840 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.398852 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.398867 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.398876 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:17Z","lastTransitionTime":"2025-12-02T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.500583 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.500902 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.500938 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.500960 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.500974 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:17Z","lastTransitionTime":"2025-12-02T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.561366 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:17 crc kubenswrapper[4691]: E1202 07:47:17.561555 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.603110 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.603363 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.603453 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.603542 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.603658 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:17Z","lastTransitionTime":"2025-12-02T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.705980 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.706061 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.706091 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.706115 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.706133 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:17Z","lastTransitionTime":"2025-12-02T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.808602 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.808650 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.808658 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.808673 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.808684 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:17Z","lastTransitionTime":"2025-12-02T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.910645 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.910750 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.910776 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.910792 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:17 crc kubenswrapper[4691]: I1202 07:47:17.910803 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:17Z","lastTransitionTime":"2025-12-02T07:47:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.012217 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.012251 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.012263 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.012276 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.012285 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:18Z","lastTransitionTime":"2025-12-02T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.115004 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.115053 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.115063 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.115078 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.115089 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:18Z","lastTransitionTime":"2025-12-02T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.217708 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.217822 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.217841 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.217871 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.217892 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:18Z","lastTransitionTime":"2025-12-02T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.319869 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.319911 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.319922 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.319940 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.319952 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:18Z","lastTransitionTime":"2025-12-02T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.422222 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.422273 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.422297 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.422320 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.422334 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:18Z","lastTransitionTime":"2025-12-02T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.525216 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.525262 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.525277 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.525291 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.525303 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:18Z","lastTransitionTime":"2025-12-02T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.560498 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.560558 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.560581 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:18 crc kubenswrapper[4691]: E1202 07:47:18.561085 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:18 crc kubenswrapper[4691]: E1202 07:47:18.560974 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:18 crc kubenswrapper[4691]: E1202 07:47:18.561051 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.627627 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.627665 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.627677 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.627693 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.627704 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:18Z","lastTransitionTime":"2025-12-02T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.730181 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.730219 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.730229 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.730244 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.730257 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:18Z","lastTransitionTime":"2025-12-02T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.832655 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.833252 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.833357 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.833383 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.833401 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:18Z","lastTransitionTime":"2025-12-02T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.936081 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.936123 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.936132 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.936146 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:18 crc kubenswrapper[4691]: I1202 07:47:18.936157 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:18Z","lastTransitionTime":"2025-12-02T07:47:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.038899 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.038938 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.038949 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.038965 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.039001 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:19Z","lastTransitionTime":"2025-12-02T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.141177 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.141240 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.141251 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.141265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.141274 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:19Z","lastTransitionTime":"2025-12-02T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.243102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.243138 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.243146 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.243161 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.243172 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:19Z","lastTransitionTime":"2025-12-02T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.345496 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.345532 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.345544 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.345559 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.345569 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:19Z","lastTransitionTime":"2025-12-02T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.447910 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.447954 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.447965 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.447980 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.447992 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:19Z","lastTransitionTime":"2025-12-02T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.550202 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.550242 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.550255 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.550270 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.550280 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:19Z","lastTransitionTime":"2025-12-02T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.560459 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:19 crc kubenswrapper[4691]: E1202 07:47:19.560584 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.653458 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.653517 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.653535 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.653560 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.653577 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:19Z","lastTransitionTime":"2025-12-02T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.756447 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.756490 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.756505 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.756523 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.756535 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:19Z","lastTransitionTime":"2025-12-02T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.858560 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.858598 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.858606 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.858619 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.858629 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:19Z","lastTransitionTime":"2025-12-02T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.961224 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.961268 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.961277 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.961291 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:19 crc kubenswrapper[4691]: I1202 07:47:19.961301 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:19Z","lastTransitionTime":"2025-12-02T07:47:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.063420 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.063463 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.063474 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.063492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.063502 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:20Z","lastTransitionTime":"2025-12-02T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.165817 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.165844 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.165853 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.165865 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.165873 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:20Z","lastTransitionTime":"2025-12-02T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.268226 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.268265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.268273 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.268288 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.268297 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:20Z","lastTransitionTime":"2025-12-02T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.370239 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.370281 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.370292 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.370306 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.370316 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:20Z","lastTransitionTime":"2025-12-02T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.473318 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.473367 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.473382 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.473417 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.473437 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:20Z","lastTransitionTime":"2025-12-02T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.561079 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.561148 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:20 crc kubenswrapper[4691]: E1202 07:47:20.561192 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.561202 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:20 crc kubenswrapper[4691]: E1202 07:47:20.561277 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:20 crc kubenswrapper[4691]: E1202 07:47:20.561428 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.562173 4691 scope.go:117] "RemoveContainer" containerID="4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4" Dec 02 07:47:20 crc kubenswrapper[4691]: E1202 07:47:20.562356 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.576024 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.576055 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.576064 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.576076 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.576087 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:20Z","lastTransitionTime":"2025-12-02T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.678742 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.678806 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.678817 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.678864 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.678877 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:20Z","lastTransitionTime":"2025-12-02T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.781814 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.781859 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.781877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.781896 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.781907 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:20Z","lastTransitionTime":"2025-12-02T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.883909 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.884476 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.884544 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.884610 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.884685 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:20Z","lastTransitionTime":"2025-12-02T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.987555 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.987848 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.987943 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.988041 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:20 crc kubenswrapper[4691]: I1202 07:47:20.988134 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:20Z","lastTransitionTime":"2025-12-02T07:47:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.090220 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.090260 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.090271 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.090287 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.090298 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:21Z","lastTransitionTime":"2025-12-02T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.192179 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.192262 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.192281 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.192299 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.192312 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:21Z","lastTransitionTime":"2025-12-02T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.295922 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.295962 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.295971 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.295988 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.295999 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:21Z","lastTransitionTime":"2025-12-02T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.399666 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.399704 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.399714 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.399736 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.399749 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:21Z","lastTransitionTime":"2025-12-02T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.502846 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.502917 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.502931 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.502954 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.502969 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:21Z","lastTransitionTime":"2025-12-02T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.560707 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:21 crc kubenswrapper[4691]: E1202 07:47:21.560877 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.605385 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.605460 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.605470 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.605484 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.605493 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:21Z","lastTransitionTime":"2025-12-02T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.707744 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.707795 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.707806 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.707820 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.707830 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:21Z","lastTransitionTime":"2025-12-02T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.810132 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.810183 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.810195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.810210 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.810220 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:21Z","lastTransitionTime":"2025-12-02T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.912212 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.912262 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.912272 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.912284 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:21 crc kubenswrapper[4691]: I1202 07:47:21.912292 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:21Z","lastTransitionTime":"2025-12-02T07:47:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.018657 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.018701 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.018714 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.018734 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.018747 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:22Z","lastTransitionTime":"2025-12-02T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.121412 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.121463 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.121480 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.121501 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.121517 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:22Z","lastTransitionTime":"2025-12-02T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.224444 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.224477 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.224488 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.224503 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.224513 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:22Z","lastTransitionTime":"2025-12-02T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.326753 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.326820 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.326830 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.326846 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.326857 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:22Z","lastTransitionTime":"2025-12-02T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.429273 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.429323 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.429336 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.429358 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.429370 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:22Z","lastTransitionTime":"2025-12-02T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.532039 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.532284 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.532374 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.532445 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.532516 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:22Z","lastTransitionTime":"2025-12-02T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.560796 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.560890 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.561750 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:22 crc kubenswrapper[4691]: E1202 07:47:22.561853 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:22 crc kubenswrapper[4691]: E1202 07:47:22.561988 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:22 crc kubenswrapper[4691]: E1202 07:47:22.562102 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.600968 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=70.600951031 podStartE2EDuration="1m10.600951031s" podCreationTimestamp="2025-12-02 07:46:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:22.588472324 +0000 UTC m=+90.372551186" watchObservedRunningTime="2025-12-02 07:47:22.600951031 +0000 UTC m=+90.385029893" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.634986 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.635211 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.635302 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.635376 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.635439 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:22Z","lastTransitionTime":"2025-12-02T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.638230 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-v26sg" podStartSLOduration=72.638218499 podStartE2EDuration="1m12.638218499s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:22.623267037 +0000 UTC m=+90.407345909" watchObservedRunningTime="2025-12-02 07:47:22.638218499 +0000 UTC m=+90.422297361"
Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.654136 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-rg26n" podStartSLOduration=72.654124446 podStartE2EDuration="1m12.654124446s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:22.652668298 +0000 UTC m=+90.436747160" watchObservedRunningTime="2025-12-02 07:47:22.654124446 +0000 UTC m=+90.438203308"
Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.654753 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podStartSLOduration=72.654743072 podStartE2EDuration="1m12.654743072s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:22.638572818 +0000 UTC m=+90.422651730" watchObservedRunningTime="2025-12-02 07:47:22.654743072 +0000 UTC m=+90.438821934"
Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.676359 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.676340229 podStartE2EDuration="38.676340229s" podCreationTimestamp="2025-12-02 07:46:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:22.676120003 +0000 UTC m=+90.460198895" watchObservedRunningTime="2025-12-02 07:47:22.676340229 +0000 UTC m=+90.460419091"
Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.689070 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.689050132 podStartE2EDuration="1m7.689050132s" podCreationTimestamp="2025-12-02 07:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:22.688526559 +0000 UTC m=+90.472605461" watchObservedRunningTime="2025-12-02 07:47:22.689050132 +0000 UTC m=+90.473128994"
Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.738437 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.738487 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.738501 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.738521 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
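
[annotation] The podStartSLOduration values above can be checked by hand: the tracker reports the gap between podCreationTimestamp and the observed running time, e.g. etcd-crc was created at 07:46:12 and observed running at 07:47:22.6, about 70.6s, which lines up with podStartSLOduration=70.600951031; the zero-valued firstStartedPulling/lastFinishedPulling suggest no image pull was needed, so the SLO duration equals the E2E duration. A small sketch of that arithmetic (timestamps copied from the etcd-crc entry; the variable names are just locals):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Values copied from the "Observed pod startup duration" entry for
        // openshift-etcd/etcd-crc above.
        created, _ := time.Parse(time.RFC3339, "2025-12-02T07:46:12Z")
        observed, _ := time.Parse(time.RFC3339Nano, "2025-12-02T07:47:22.600951031Z")

        // With no image pull recorded, the created-to-running gap is both the
        // SLO duration and the E2E duration reported in the log.
        fmt.Println(observed.Sub(created).Seconds()) // 70.600951031
    }
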
"Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.738533 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:22Z","lastTransitionTime":"2025-12-02T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.759383 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6gcsh" podStartSLOduration=72.759361337 podStartE2EDuration="1m12.759361337s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:22.759284305 +0000 UTC m=+90.543363167" watchObservedRunningTime="2025-12-02 07:47:22.759361337 +0000 UTC m=+90.543440199" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.784894 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.784877307 podStartE2EDuration="1m12.784877307s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:22.775769538 +0000 UTC m=+90.559848400" watchObservedRunningTime="2025-12-02 07:47:22.784877307 +0000 UTC m=+90.568956169" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.785274 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=21.785268537 podStartE2EDuration="21.785268537s" podCreationTimestamp="2025-12-02 07:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:22.784210789 +0000 UTC m=+90.568289651" watchObservedRunningTime="2025-12-02 07:47:22.785268537 +0000 UTC m=+90.569347409" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.824930 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rs5s2" podStartSLOduration=72.824915407 podStartE2EDuration="1m12.824915407s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:22.824344262 +0000 UTC m=+90.608423144" watchObservedRunningTime="2025-12-02 07:47:22.824915407 +0000 UTC m=+90.608994269" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.840743 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.840817 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.840830 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.840865 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.840876 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:22Z","lastTransitionTime":"2025-12-02T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.982544 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.982577 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.982586 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.982600 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.982611 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:22Z","lastTransitionTime":"2025-12-02T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.996651 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.996686 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.996694 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.996707 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 02 07:47:22 crc kubenswrapper[4691]: I1202 07:47:22.996716 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-02T07:47:22Z","lastTransitionTime":"2025-12-02T07:47:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.037845 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-zszqm" podStartSLOduration=73.037827483 podStartE2EDuration="1m13.037827483s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:22.836215513 +0000 UTC m=+90.620294385" watchObservedRunningTime="2025-12-02 07:47:23.037827483 +0000 UTC m=+90.821906355"
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.038356 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc"]
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.038750 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc"
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.040397 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.040588 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.040927 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.042839 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.081935 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc"
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.081977 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc"
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.082007 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc"
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.082057 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc"
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc" Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.082130 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc" Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.182894 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc" Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.182936 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc" Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.182969 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc" Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.183014 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc" Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.183035 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-service-ca\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc" Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.183042 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc" Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.183173 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc" Dec 02 
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.195860 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc"
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.200080 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-fmhqc\" (UID: \"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc"
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.353847 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc"
Dec 02 07:47:23 crc kubenswrapper[4691]: I1202 07:47:23.560589 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps"
Dec 02 07:47:23 crc kubenswrapper[4691]: E1202 07:47:23.561139 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a"
Dec 02 07:47:24 crc kubenswrapper[4691]: I1202 07:47:24.027941 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc" event={"ID":"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d","Type":"ContainerStarted","Data":"0f010a0976968f8b7e82c18fdd28fbc1674e16a9a15c4ea46eeee677634fde2c"}
Dec 02 07:47:24 crc kubenswrapper[4691]: I1202 07:47:24.028018 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc" event={"ID":"95f6c49c-cca3-4e8d-bb9f-34032b0cbf7d","Type":"ContainerStarted","Data":"a2a878a37253071871ed8deb6a171a1e40ae5b0275d7e814b8f79e5511393cc4"}
Dec 02 07:47:24 crc kubenswrapper[4691]: I1202 07:47:24.047707 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-fmhqc" podStartSLOduration=74.047683137 podStartE2EDuration="1m14.047683137s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:24.04246713 +0000 UTC m=+91.826546002" watchObservedRunningTime="2025-12-02 07:47:24.047683137 +0000 UTC m=+91.831761999"
Dec 02 07:47:24 crc kubenswrapper[4691]: I1202 07:47:24.560918 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
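
[annotation] The cluster-version-operator mount sequence above shows the kubelet volume manager's usual three-step flow per volume: the reconciler first confirms the volume is attached (VerifyControllerAttachedVolume, reconciler_common.go:245), then starts the mount (MountVolume started, reconciler_common.go:218), and the operation generator reports MountVolume.SetUp succeeded (operation_generator.go:637); the kubelet waits for all mounts before moving on to the pod sandbox. A toy illustration of just that ordering (the volume names are copied from the entries; the stage labels are mine, not kubelet types):

    package main

    import "fmt"

    func main() {
        volumes := []string{"etc-cvo-updatepayloads", "service-ca", "etc-ssl-certs", "serving-cert", "kube-api-access"}
        stages := []string{
            "operationExecutor.VerifyControllerAttachedVolume started", // reconciler_common.go:245
            "operationExecutor.MountVolume started",                    // reconciler_common.go:218
            "MountVolume.SetUp succeeded",                              // operation_generator.go:637
        }
        for _, stage := range stages {
            for _, v := range volumes {
                fmt.Printf("%s for volume %q\n", stage, v)
            }
        }
        // Once every volume reaches SetUp, the kubelet can proceed to create
        // the pod sandbox (the 07:47:23.353847 entry above).
    }
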
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:24 crc kubenswrapper[4691]: E1202 07:47:24.561357 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:24 crc kubenswrapper[4691]: I1202 07:47:24.561061 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:24 crc kubenswrapper[4691]: E1202 07:47:24.561453 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:24 crc kubenswrapper[4691]: I1202 07:47:24.560989 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:24 crc kubenswrapper[4691]: E1202 07:47:24.561515 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:25 crc kubenswrapper[4691]: I1202 07:47:25.561219 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:25 crc kubenswrapper[4691]: E1202 07:47:25.561408 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:26 crc kubenswrapper[4691]: I1202 07:47:26.561395 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:26 crc kubenswrapper[4691]: I1202 07:47:26.561421 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:26 crc kubenswrapper[4691]: E1202 07:47:26.561581 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:26 crc kubenswrapper[4691]: E1202 07:47:26.561700 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:26 crc kubenswrapper[4691]: I1202 07:47:26.561425 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:26 crc kubenswrapper[4691]: E1202 07:47:26.561847 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:27 crc kubenswrapper[4691]: I1202 07:47:27.561677 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:27 crc kubenswrapper[4691]: E1202 07:47:27.562892 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:28 crc kubenswrapper[4691]: I1202 07:47:28.073568 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:28 crc kubenswrapper[4691]: E1202 07:47:28.073825 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:47:28 crc kubenswrapper[4691]: E1202 07:47:28.073960 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs podName:b30f2d1f-53a1-4e87-819d-1e20bf3ed92a nodeName:}" failed. No retries permitted until 2025-12-02 07:48:32.073899296 +0000 UTC m=+159.857978198 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs") pod "network-metrics-daemon-8lqps" (UID: "b30f2d1f-53a1-4e87-819d-1e20bf3ed92a") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 02 07:47:28 crc kubenswrapper[4691]: I1202 07:47:28.561557 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:28 crc kubenswrapper[4691]: E1202 07:47:28.561698 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:28 crc kubenswrapper[4691]: I1202 07:47:28.561708 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:28 crc kubenswrapper[4691]: E1202 07:47:28.561804 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:28 crc kubenswrapper[4691]: I1202 07:47:28.561580 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:28 crc kubenswrapper[4691]: E1202 07:47:28.561874 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:29 crc kubenswrapper[4691]: I1202 07:47:29.561367 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:29 crc kubenswrapper[4691]: E1202 07:47:29.561514 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:30 crc kubenswrapper[4691]: I1202 07:47:30.561050 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:30 crc kubenswrapper[4691]: I1202 07:47:30.561570 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:30 crc kubenswrapper[4691]: E1202 07:47:30.561908 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:30 crc kubenswrapper[4691]: I1202 07:47:30.561181 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:30 crc kubenswrapper[4691]: E1202 07:47:30.562381 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:30 crc kubenswrapper[4691]: E1202 07:47:30.562482 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:31 crc kubenswrapper[4691]: I1202 07:47:31.560690 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:31 crc kubenswrapper[4691]: E1202 07:47:31.560890 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:32 crc kubenswrapper[4691]: I1202 07:47:32.560965 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:32 crc kubenswrapper[4691]: I1202 07:47:32.561034 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:32 crc kubenswrapper[4691]: E1202 07:47:32.561960 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:32 crc kubenswrapper[4691]: I1202 07:47:32.561971 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:32 crc kubenswrapper[4691]: E1202 07:47:32.562018 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:32 crc kubenswrapper[4691]: E1202 07:47:32.562067 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:33 crc kubenswrapper[4691]: I1202 07:47:33.561506 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:33 crc kubenswrapper[4691]: E1202 07:47:33.561911 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:34 crc kubenswrapper[4691]: I1202 07:47:34.561535 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:34 crc kubenswrapper[4691]: I1202 07:47:34.561603 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:34 crc kubenswrapper[4691]: E1202 07:47:34.562415 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:34 crc kubenswrapper[4691]: I1202 07:47:34.562489 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:34 crc kubenswrapper[4691]: E1202 07:47:34.562536 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:34 crc kubenswrapper[4691]: E1202 07:47:34.562858 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:34 crc kubenswrapper[4691]: I1202 07:47:34.563338 4691 scope.go:117] "RemoveContainer" containerID="4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4" Dec 02 07:47:34 crc kubenswrapper[4691]: E1202 07:47:34.563518 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-7pgxh_openshift-ovn-kubernetes(3605748c-8980-4aa9-8d28-f18a17aa8124)\"" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" Dec 02 07:47:35 crc kubenswrapper[4691]: I1202 07:47:35.560800 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:35 crc kubenswrapper[4691]: E1202 07:47:35.560939 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:36 crc kubenswrapper[4691]: I1202 07:47:36.560880 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:36 crc kubenswrapper[4691]: E1202 07:47:36.561002 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:36 crc kubenswrapper[4691]: I1202 07:47:36.561188 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:36 crc kubenswrapper[4691]: E1202 07:47:36.561379 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:36 crc kubenswrapper[4691]: I1202 07:47:36.561843 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:36 crc kubenswrapper[4691]: E1202 07:47:36.561936 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:37 crc kubenswrapper[4691]: I1202 07:47:37.560632 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:37 crc kubenswrapper[4691]: E1202 07:47:37.561013 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:38 crc kubenswrapper[4691]: I1202 07:47:38.561185 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:38 crc kubenswrapper[4691]: I1202 07:47:38.561185 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:38 crc kubenswrapper[4691]: I1202 07:47:38.561375 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:38 crc kubenswrapper[4691]: E1202 07:47:38.561489 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:38 crc kubenswrapper[4691]: E1202 07:47:38.561920 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:38 crc kubenswrapper[4691]: E1202 07:47:38.561722 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:39 crc kubenswrapper[4691]: I1202 07:47:39.561231 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:39 crc kubenswrapper[4691]: E1202 07:47:39.561371 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:40 crc kubenswrapper[4691]: I1202 07:47:40.560550 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:40 crc kubenswrapper[4691]: I1202 07:47:40.560616 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:40 crc kubenswrapper[4691]: E1202 07:47:40.560692 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:40 crc kubenswrapper[4691]: E1202 07:47:40.560778 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:40 crc kubenswrapper[4691]: I1202 07:47:40.560839 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:40 crc kubenswrapper[4691]: E1202 07:47:40.560988 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:41 crc kubenswrapper[4691]: I1202 07:47:41.560643 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:41 crc kubenswrapper[4691]: E1202 07:47:41.560917 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:42 crc kubenswrapper[4691]: I1202 07:47:42.560746 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:42 crc kubenswrapper[4691]: I1202 07:47:42.560789 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:42 crc kubenswrapper[4691]: I1202 07:47:42.560863 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:42 crc kubenswrapper[4691]: E1202 07:47:42.561995 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:42 crc kubenswrapper[4691]: E1202 07:47:42.562111 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:42 crc kubenswrapper[4691]: E1202 07:47:42.562350 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:43 crc kubenswrapper[4691]: I1202 07:47:43.090948 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gcsh_eb6171dd-c2ea-4c52-b906-e8a9a7ff6537/kube-multus/1.log" Dec 02 07:47:43 crc kubenswrapper[4691]: I1202 07:47:43.091286 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gcsh_eb6171dd-c2ea-4c52-b906-e8a9a7ff6537/kube-multus/0.log" Dec 02 07:47:43 crc kubenswrapper[4691]: I1202 07:47:43.091322 4691 generic.go:334] "Generic (PLEG): container finished" podID="eb6171dd-c2ea-4c52-b906-e8a9a7ff6537" containerID="0b18518bf13d33754e2a8f6985e2da3df9cab1fc54a2e38240a65685c3fb2722" exitCode=1 Dec 02 07:47:43 crc kubenswrapper[4691]: I1202 07:47:43.091353 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gcsh" event={"ID":"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537","Type":"ContainerDied","Data":"0b18518bf13d33754e2a8f6985e2da3df9cab1fc54a2e38240a65685c3fb2722"} Dec 02 07:47:43 crc kubenswrapper[4691]: I1202 07:47:43.091392 4691 scope.go:117] "RemoveContainer" containerID="73e55d5e3b217d8ca7dd97cdcb08935e183362b40c74acb58a239f22b59bda82" Dec 02 07:47:43 crc kubenswrapper[4691]: I1202 07:47:43.091797 4691 scope.go:117] "RemoveContainer" containerID="0b18518bf13d33754e2a8f6985e2da3df9cab1fc54a2e38240a65685c3fb2722" Dec 02 07:47:43 crc kubenswrapper[4691]: E1202 07:47:43.091954 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-6gcsh_openshift-multus(eb6171dd-c2ea-4c52-b906-e8a9a7ff6537)\"" pod="openshift-multus/multus-6gcsh" podUID="eb6171dd-c2ea-4c52-b906-e8a9a7ff6537" Dec 02 07:47:43 crc kubenswrapper[4691]: I1202 07:47:43.561384 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:43 crc kubenswrapper[4691]: E1202 07:47:43.561505 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:44 crc kubenswrapper[4691]: I1202 07:47:44.096053 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gcsh_eb6171dd-c2ea-4c52-b906-e8a9a7ff6537/kube-multus/1.log" Dec 02 07:47:44 crc kubenswrapper[4691]: I1202 07:47:44.560865 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:44 crc kubenswrapper[4691]: I1202 07:47:44.560939 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:44 crc kubenswrapper[4691]: I1202 07:47:44.560878 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:44 crc kubenswrapper[4691]: E1202 07:47:44.561032 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:44 crc kubenswrapper[4691]: E1202 07:47:44.561116 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:44 crc kubenswrapper[4691]: E1202 07:47:44.561224 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:45 crc kubenswrapper[4691]: I1202 07:47:45.561199 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:45 crc kubenswrapper[4691]: E1202 07:47:45.561335 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:46 crc kubenswrapper[4691]: I1202 07:47:46.561416 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:46 crc kubenswrapper[4691]: E1202 07:47:46.561563 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:46 crc kubenswrapper[4691]: I1202 07:47:46.561808 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:46 crc kubenswrapper[4691]: E1202 07:47:46.561884 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:46 crc kubenswrapper[4691]: I1202 07:47:46.562103 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:46 crc kubenswrapper[4691]: E1202 07:47:46.562287 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:47 crc kubenswrapper[4691]: I1202 07:47:47.560641 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:47 crc kubenswrapper[4691]: E1202 07:47:47.561320 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:48 crc kubenswrapper[4691]: I1202 07:47:48.561041 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:48 crc kubenswrapper[4691]: I1202 07:47:48.561126 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:48 crc kubenswrapper[4691]: E1202 07:47:48.561173 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:48 crc kubenswrapper[4691]: I1202 07:47:48.561278 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:48 crc kubenswrapper[4691]: E1202 07:47:48.561438 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:48 crc kubenswrapper[4691]: E1202 07:47:48.561536 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:49 crc kubenswrapper[4691]: I1202 07:47:49.561344 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:49 crc kubenswrapper[4691]: E1202 07:47:49.561495 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:49 crc kubenswrapper[4691]: I1202 07:47:49.562106 4691 scope.go:117] "RemoveContainer" containerID="4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4" Dec 02 07:47:50 crc kubenswrapper[4691]: I1202 07:47:50.120626 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/3.log" Dec 02 07:47:50 crc kubenswrapper[4691]: I1202 07:47:50.123515 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerStarted","Data":"0a5db1288676acd22e6bdb6f14c182c8bb9c93429b8c8b433d96b27c2b5e544e"} Dec 02 07:47:50 crc kubenswrapper[4691]: I1202 07:47:50.157822 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podStartSLOduration=100.157805094 podStartE2EDuration="1m40.157805094s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:47:50.15689865 +0000 UTC m=+117.940977512" watchObservedRunningTime="2025-12-02 07:47:50.157805094 +0000 UTC m=+117.941883956" Dec 02 07:47:50 crc kubenswrapper[4691]: I1202 07:47:50.409010 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8lqps"] Dec 02 07:47:50 crc kubenswrapper[4691]: I1202 07:47:50.409141 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:50 crc kubenswrapper[4691]: E1202 07:47:50.409233 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:50 crc kubenswrapper[4691]: I1202 07:47:50.561593 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:50 crc kubenswrapper[4691]: I1202 07:47:50.561661 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:50 crc kubenswrapper[4691]: I1202 07:47:50.561673 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:50 crc kubenswrapper[4691]: E1202 07:47:50.561778 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:50 crc kubenswrapper[4691]: E1202 07:47:50.561941 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:50 crc kubenswrapper[4691]: E1202 07:47:50.562079 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:52 crc kubenswrapper[4691]: I1202 07:47:52.560620 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:52 crc kubenswrapper[4691]: I1202 07:47:52.560620 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:52 crc kubenswrapper[4691]: I1202 07:47:52.560660 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:52 crc kubenswrapper[4691]: I1202 07:47:52.560687 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:52 crc kubenswrapper[4691]: E1202 07:47:52.562487 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:52 crc kubenswrapper[4691]: E1202 07:47:52.562550 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:52 crc kubenswrapper[4691]: E1202 07:47:52.562597 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:52 crc kubenswrapper[4691]: E1202 07:47:52.562677 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:52 crc kubenswrapper[4691]: E1202 07:47:52.572646 4691 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 02 07:47:52 crc kubenswrapper[4691]: E1202 07:47:52.654282 4691 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 07:47:54 crc kubenswrapper[4691]: I1202 07:47:54.560910 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:54 crc kubenswrapper[4691]: I1202 07:47:54.560979 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:54 crc kubenswrapper[4691]: I1202 07:47:54.560997 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:54 crc kubenswrapper[4691]: E1202 07:47:54.561073 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:54 crc kubenswrapper[4691]: I1202 07:47:54.561105 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:54 crc kubenswrapper[4691]: E1202 07:47:54.561212 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:54 crc kubenswrapper[4691]: E1202 07:47:54.561324 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:54 crc kubenswrapper[4691]: E1202 07:47:54.561372 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:56 crc kubenswrapper[4691]: I1202 07:47:56.561458 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:56 crc kubenswrapper[4691]: I1202 07:47:56.561546 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:56 crc kubenswrapper[4691]: I1202 07:47:56.561459 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:56 crc kubenswrapper[4691]: E1202 07:47:56.561595 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:56 crc kubenswrapper[4691]: I1202 07:47:56.561549 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:56 crc kubenswrapper[4691]: E1202 07:47:56.561908 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:56 crc kubenswrapper[4691]: E1202 07:47:56.561973 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:56 crc kubenswrapper[4691]: E1202 07:47:56.561991 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:47:57 crc kubenswrapper[4691]: I1202 07:47:57.561753 4691 scope.go:117] "RemoveContainer" containerID="0b18518bf13d33754e2a8f6985e2da3df9cab1fc54a2e38240a65685c3fb2722" Dec 02 07:47:57 crc kubenswrapper[4691]: E1202 07:47:57.655748 4691 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 02 07:47:58 crc kubenswrapper[4691]: I1202 07:47:58.146846 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gcsh_eb6171dd-c2ea-4c52-b906-e8a9a7ff6537/kube-multus/1.log" Dec 02 07:47:58 crc kubenswrapper[4691]: I1202 07:47:58.147173 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gcsh" event={"ID":"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537","Type":"ContainerStarted","Data":"9a3b02e6e070c37eaa327e48115ddfe37fb61e3e7f06d3a121542f798fd2097f"} Dec 02 07:47:58 crc kubenswrapper[4691]: I1202 07:47:58.560947 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:47:58 crc kubenswrapper[4691]: I1202 07:47:58.561015 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:47:58 crc kubenswrapper[4691]: I1202 07:47:58.561015 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:47:58 crc kubenswrapper[4691]: E1202 07:47:58.561083 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:47:58 crc kubenswrapper[4691]: I1202 07:47:58.561129 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:47:58 crc kubenswrapper[4691]: E1202 07:47:58.561191 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:47:58 crc kubenswrapper[4691]: E1202 07:47:58.561250 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:47:58 crc kubenswrapper[4691]: E1202 07:47:58.561337 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:48:00 crc kubenswrapper[4691]: I1202 07:48:00.560579 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:48:00 crc kubenswrapper[4691]: I1202 07:48:00.560636 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:48:00 crc kubenswrapper[4691]: I1202 07:48:00.560687 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:48:00 crc kubenswrapper[4691]: I1202 07:48:00.560716 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:48:00 crc kubenswrapper[4691]: E1202 07:48:00.560778 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:48:00 crc kubenswrapper[4691]: E1202 07:48:00.561023 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:48:00 crc kubenswrapper[4691]: E1202 07:48:00.561185 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:48:00 crc kubenswrapper[4691]: E1202 07:48:00.561327 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:48:02 crc kubenswrapper[4691]: I1202 07:48:02.561071 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:48:02 crc kubenswrapper[4691]: I1202 07:48:02.561137 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:48:02 crc kubenswrapper[4691]: I1202 07:48:02.561153 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:48:02 crc kubenswrapper[4691]: E1202 07:48:02.562973 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 02 07:48:02 crc kubenswrapper[4691]: I1202 07:48:02.563068 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:48:02 crc kubenswrapper[4691]: E1202 07:48:02.563099 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 02 07:48:02 crc kubenswrapper[4691]: E1202 07:48:02.563425 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8lqps" podUID="b30f2d1f-53a1-4e87-819d-1e20bf3ed92a" Dec 02 07:48:02 crc kubenswrapper[4691]: E1202 07:48:02.563547 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 02 07:48:03 crc kubenswrapper[4691]: I1202 07:48:03.966545 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.007866 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ng6l4"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.008565 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.008673 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-87dnb"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.009457 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.014353 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.014971 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.015320 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.015675 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.016445 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.018162 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.022018 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qvdbg"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.022522 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.024391 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-jsqnn"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.027333 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jkx5c"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.027867 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.028083 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jsqnn" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.028510 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.029632 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.030370 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.030966 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.031199 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.031469 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mcdgg"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.031618 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.031990 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.032128 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.032808 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.033134 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.033329 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.033518 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.033692 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.033928 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.028834 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.034224 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.034453 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.034694 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.035097 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.035814 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tcd8d"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.036486 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.040613 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.042684 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.043272 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.045139 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tbm97"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.045897 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.046540 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.047258 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.047791 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-tbm97" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.047823 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.048143 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.048335 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kq8jr"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.048448 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.048821 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.122571 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.122640 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.122940 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.130074 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.132228 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.132373 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lb7g9"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.133188 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.139128 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.139456 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.139586 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.140994 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.141404 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lb7g9" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.157981 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.159771 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.160032 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.160260 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.160502 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.160772 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.160915 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.161043 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.161175 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.161833 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.162794 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcf2998e-5d89-4f76-afc0-490573eae9d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ncsfj\" (UID: \"dcf2998e-5d89-4f76-afc0-490573eae9d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.162895 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf65t\" (UniqueName: \"kubernetes.io/projected/d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6-kube-api-access-zf65t\") pod \"openshift-config-operator-7777fb866f-nl6hp\" (UID: \"d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163006 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dcf2998e-5d89-4f76-afc0-490573eae9d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ncsfj\" (UID: \"dcf2998e-5d89-4f76-afc0-490573eae9d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163128 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163268 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 07:48:04 crc 
kubenswrapper[4691]: I1202 07:48:04.163349 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5-machine-approver-tls\") pod \"machine-approver-56656f9798-j4hpv\" (UID: \"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163408 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163484 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qscq2\" (UniqueName: \"kubernetes.io/projected/7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5-kube-api-access-qscq2\") pod \"machine-approver-56656f9798-j4hpv\" (UID: \"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163550 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea5aa03a-f69e-4e94-8586-de42593bce47-config\") pod \"route-controller-manager-6576b87f9c-bdlzb\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163620 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdhlb\" (UniqueName: \"kubernetes.io/projected/ea5aa03a-f69e-4e94-8586-de42593bce47-kube-api-access-bdhlb\") pod \"route-controller-manager-6576b87f9c-bdlzb\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163694 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nl6hp\" (UID: \"d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163814 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5plbh\" (UniqueName: \"kubernetes.io/projected/dcf2998e-5d89-4f76-afc0-490573eae9d8-kube-api-access-5plbh\") pod \"cluster-image-registry-operator-dc59b4c8b-ncsfj\" (UID: \"dcf2998e-5d89-4f76-afc0-490573eae9d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163913 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6-serving-cert\") pod \"openshift-config-operator-7777fb866f-nl6hp\" (UID: \"d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.164007 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea5aa03a-f69e-4e94-8586-de42593bce47-client-ca\") pod \"route-controller-manager-6576b87f9c-bdlzb\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.164073 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dcf2998e-5d89-4f76-afc0-490573eae9d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ncsfj\" (UID: \"dcf2998e-5d89-4f76-afc0-490573eae9d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.164146 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea5aa03a-f69e-4e94-8586-de42593bce47-serving-cert\") pod \"route-controller-manager-6576b87f9c-bdlzb\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.164212 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf-metrics-tls\") pod \"dns-operator-744455d44c-tbm97\" (UID: \"4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf\") " pod="openshift-dns-operator/dns-operator-744455d44c-tbm97" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.164281 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5-auth-proxy-config\") pod \"machine-approver-56656f9798-j4hpv\" (UID: \"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.164356 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mglkm\" (UniqueName: \"kubernetes.io/projected/4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf-kube-api-access-mglkm\") pod \"dns-operator-744455d44c-tbm97\" (UID: \"4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf\") " pod="openshift-dns-operator/dns-operator-744455d44c-tbm97" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.164440 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjb4x\" (UniqueName: \"kubernetes.io/projected/db7d7a68-80e4-4e9f-a7b4-adcc14282d4d-kube-api-access-zjb4x\") pod \"downloads-7954f5f757-jsqnn\" (UID: \"db7d7a68-80e4-4e9f-a7b4-adcc14282d4d\") " pod="openshift-console/downloads-7954f5f757-jsqnn" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.164511 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5-config\") pod \"machine-approver-56656f9798-j4hpv\" (UID: \"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163551 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 07:48:04 crc 
kubenswrapper[4691]: I1202 07:48:04.165195 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163584 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.163609 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.165551 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.167155 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.167216 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.167470 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.167598 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.167728 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.167773 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.167828 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.167906 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.167963 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168072 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168163 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168197 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168279 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168337 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168426 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168477 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168525 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168586 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168627 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168696 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168832 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168886 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168941 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.168969 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169078 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169140 4691 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169218 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169248 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169296 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169353 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169369 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169396 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169445 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169456 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169358 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169551 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169519 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169220 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169631 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169707 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.169795 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.172187 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.172435 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.172621 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.172856 4691 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.173046 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.173209 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.173404 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.173928 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.174487 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.174832 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.175074 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.175294 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.175526 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.175677 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.175816 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.175925 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.175945 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.176160 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.176164 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.177363 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.178088 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.178315 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.178748 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.179302 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-h7dkw"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.179882 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.181870 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.182457 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.182619 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.182455 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.183155 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.183664 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.184598 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sv92r"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.185040 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.185404 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.186160 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.186409 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.186812 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.199289 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.200933 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.205502 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.214113 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.215777 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.216925 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-wx6m2"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.217046 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.217404 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-87dnb"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.217482 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.218401 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4ms2c"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.218522 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.218946 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ms2c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.221690 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.223256 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.224504 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.224502 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.225470 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.225780 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-p896d"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.227019 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.227902 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.228508 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.228886 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ncvcc"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.229231 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.230957 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.231266 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jlb9l"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.231657 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.234167 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.236885 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jkx5c"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.237803 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qvdbg"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.238700 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.239862 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jsqnn"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.241034 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.242061 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lb7g9"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.243128 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.244218 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-mcdgg"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.245276 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.246410 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.247500 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.248890 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tbm97"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.250822 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.251367 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.254557 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2pv42"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.256070 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tcd8d"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.256125 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.258415 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wx6m2"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.262994 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.264931 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6-serving-cert\") pod \"openshift-config-operator-7777fb866f-nl6hp\" (UID: \"d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.264976 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dcf2998e-5d89-4f76-afc0-490573eae9d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ncsfj\" (UID: \"dcf2998e-5d89-4f76-afc0-490573eae9d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.264996 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea5aa03a-f69e-4e94-8586-de42593bce47-client-ca\") pod \"route-controller-manager-6576b87f9c-bdlzb\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265014 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf-metrics-tls\") pod \"dns-operator-744455d44c-tbm97\" (UID: \"4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf\") " pod="openshift-dns-operator/dns-operator-744455d44c-tbm97" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265031 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea5aa03a-f69e-4e94-8586-de42593bce47-serving-cert\") pod \"route-controller-manager-6576b87f9c-bdlzb\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265049 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5-auth-proxy-config\") pod \"machine-approver-56656f9798-j4hpv\" (UID: \"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265064 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mglkm\" (UniqueName: \"kubernetes.io/projected/4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf-kube-api-access-mglkm\") pod \"dns-operator-744455d44c-tbm97\" (UID: \"4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf\") " pod="openshift-dns-operator/dns-operator-744455d44c-tbm97" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265096 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjb4x\" (UniqueName: \"kubernetes.io/projected/db7d7a68-80e4-4e9f-a7b4-adcc14282d4d-kube-api-access-zjb4x\") pod \"downloads-7954f5f757-jsqnn\" (UID: \"db7d7a68-80e4-4e9f-a7b4-adcc14282d4d\") " pod="openshift-console/downloads-7954f5f757-jsqnn" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265116 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5-config\") pod \"machine-approver-56656f9798-j4hpv\" (UID: \"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265134 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcf2998e-5d89-4f76-afc0-490573eae9d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ncsfj\" (UID: \"dcf2998e-5d89-4f76-afc0-490573eae9d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265158 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf65t\" (UniqueName: \"kubernetes.io/projected/d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6-kube-api-access-zf65t\") pod \"openshift-config-operator-7777fb866f-nl6hp\" (UID: \"d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265177 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qscq2\" (UniqueName: 
\"kubernetes.io/projected/7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5-kube-api-access-qscq2\") pod \"machine-approver-56656f9798-j4hpv\" (UID: \"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265196 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea5aa03a-f69e-4e94-8586-de42593bce47-config\") pod \"route-controller-manager-6576b87f9c-bdlzb\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265213 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dcf2998e-5d89-4f76-afc0-490573eae9d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ncsfj\" (UID: \"dcf2998e-5d89-4f76-afc0-490573eae9d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265227 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5-machine-approver-tls\") pod \"machine-approver-56656f9798-j4hpv\" (UID: \"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265244 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdhlb\" (UniqueName: \"kubernetes.io/projected/ea5aa03a-f69e-4e94-8586-de42593bce47-kube-api-access-bdhlb\") pod \"route-controller-manager-6576b87f9c-bdlzb\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265262 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nl6hp\" (UID: \"d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265285 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5plbh\" (UniqueName: \"kubernetes.io/projected/dcf2998e-5d89-4f76-afc0-490573eae9d8-kube-api-access-5plbh\") pod \"cluster-image-registry-operator-dc59b4c8b-ncsfj\" (UID: \"dcf2998e-5d89-4f76-afc0-490573eae9d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.265920 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8wz4g"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.268541 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8wz4g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.268786 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dcf2998e-5d89-4f76-afc0-490573eae9d8-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-ncsfj\" (UID: \"dcf2998e-5d89-4f76-afc0-490573eae9d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.270377 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea5aa03a-f69e-4e94-8586-de42593bce47-client-ca\") pod \"route-controller-manager-6576b87f9c-bdlzb\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.270645 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5-auth-proxy-config\") pod \"machine-approver-56656f9798-j4hpv\" (UID: \"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.270944 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5-config\") pod \"machine-approver-56656f9798-j4hpv\" (UID: \"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.272043 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nl6hp\" (UID: \"d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.272381 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea5aa03a-f69e-4e94-8586-de42593bce47-config\") pod \"route-controller-manager-6576b87f9c-bdlzb\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.273360 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5-machine-approver-tls\") pod \"machine-approver-56656f9798-j4hpv\" (UID: \"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.273605 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.276585 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.276959 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6-serving-cert\") pod \"openshift-config-operator-7777fb866f-nl6hp\" (UID: \"d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.280732 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf-metrics-tls\") pod \"dns-operator-744455d44c-tbm97\" (UID: \"4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf\") " pod="openshift-dns-operator/dns-operator-744455d44c-tbm97" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.280793 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea5aa03a-f69e-4e94-8586-de42593bce47-serving-cert\") pod \"route-controller-manager-6576b87f9c-bdlzb\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.280832 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4ms2c"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.281194 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sv92r"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.282010 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dcf2998e-5d89-4f76-afc0-490573eae9d8-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-ncsfj\" (UID: \"dcf2998e-5d89-4f76-afc0-490573eae9d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.282196 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.283441 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.297212 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.305887 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.305935 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.308681 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kq8jr"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.310047 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-p896d"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.311266 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.312378 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ng6l4"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.313561 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.314776 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2pv42"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.316006 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.317421 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ncvcc"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.318506 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.319624 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jlb9l"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.320680 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.321820 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jcpp8"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.322575 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.323179 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hs9sj"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.323845 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hs9sj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.324528 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hs9sj"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.326000 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jcpp8"] Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.330482 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.351027 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366249 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-audit\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366289 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-audit-dir\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366316 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-encryption-config\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366336 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-node-pullsecrets\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366359 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366378 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-audit-policies\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366396 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/ad8eb845-25ea-4917-80a0-46e84ed6ef97-config\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366432 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/011633b2-37cb-46bd-b120-a9a9023d40fb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-blfck\" (UID: \"011633b2-37cb-46bd-b120-a9a9023d40fb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366448 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366465 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366482 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad8eb845-25ea-4917-80a0-46e84ed6ef97-serving-cert\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366497 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd9dk\" (UniqueName: \"kubernetes.io/projected/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-kube-api-access-kd9dk\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366529 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-registry-certificates\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366562 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bgj9\" (UniqueName: \"kubernetes.io/projected/1e8a5b37-f843-459c-93ca-379044dbbec4-kube-api-access-7bgj9\") pod \"openshift-controller-manager-operator-756b6f6bc6-7ff8p\" (UID: \"1e8a5b37-f843-459c-93ca-379044dbbec4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366604 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-config\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366645 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-serving-cert\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366675 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677bf100-9036-4b58-9658-6b918304ba47-config\") pod \"machine-api-operator-5694c8668f-qvdbg\" (UID: \"677bf100-9036-4b58-9658-6b918304ba47\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366695 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c7ffef5-c616-4596-aab4-6daa6eed6d46-serving-cert\") pod \"console-operator-58897d9998-tcd8d\" (UID: \"5c7ffef5-c616-4596-aab4-6daa6eed6d46\") " pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366710 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7b09b1-f092-4e2b-8b07-e03343753503-serving-cert\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366736 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-client-ca\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366802 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c7ffef5-c616-4596-aab4-6daa6eed6d46-trusted-ca\") pod \"console-operator-58897d9998-tcd8d\" (UID: \"5c7ffef5-c616-4596-aab4-6daa6eed6d46\") " pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366851 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366876 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drtn4\" (UniqueName: 
\"kubernetes.io/projected/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-kube-api-access-drtn4\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366894 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-registry-tls\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366917 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-audit-dir\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366945 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e8a5b37-f843-459c-93ca-379044dbbec4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7ff8p\" (UID: \"1e8a5b37-f843-459c-93ca-379044dbbec4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.366990 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad8eb845-25ea-4917-80a0-46e84ed6ef97-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367012 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bks2b\" (UniqueName: \"kubernetes.io/projected/ad8eb845-25ea-4917-80a0-46e84ed6ef97-kube-api-access-bks2b\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367036 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-bound-sa-token\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367052 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367065 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ad8eb845-25ea-4917-80a0-46e84ed6ef97-service-ca-bundle\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367085 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367109 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-etcd-serving-ca\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367128 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5qq7\" (UniqueName: \"kubernetes.io/projected/42a8f203-7e80-4e55-bd79-afe843279906-kube-api-access-c5qq7\") pod \"cluster-samples-operator-665b6dd947-8kj4z\" (UID: \"42a8f203-7e80-4e55-bd79-afe843279906\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367177 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/011633b2-37cb-46bd-b120-a9a9023d40fb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-blfck\" (UID: \"011633b2-37cb-46bd-b120-a9a9023d40fb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367196 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367239 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367255 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-etcd-client\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367285 4691 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367325 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-audit-policies\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367340 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-serving-cert\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367398 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfzrp\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-kube-api-access-pfzrp\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367413 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpdrb\" (UniqueName: \"kubernetes.io/projected/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-kube-api-access-vpdrb\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367429 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-config\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367446 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-trusted-ca\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367460 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367474 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5c7ffef5-c616-4596-aab4-6daa6eed6d46-config\") pod \"console-operator-58897d9998-tcd8d\" (UID: \"5c7ffef5-c616-4596-aab4-6daa6eed6d46\") " pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367531 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367545 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twxxt\" (UniqueName: \"kubernetes.io/projected/5c7ffef5-c616-4596-aab4-6daa6eed6d46-kube-api-access-twxxt\") pod \"console-operator-58897d9998-tcd8d\" (UID: \"5c7ffef5-c616-4596-aab4-6daa6eed6d46\") " pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.367582 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:04.867571343 +0000 UTC m=+132.651650205 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367591 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-etcd-client\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367606 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-audit-dir\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367624 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367683 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e8a5b37-f843-459c-93ca-379044dbbec4-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-7ff8p\" (UID: \"1e8a5b37-f843-459c-93ca-379044dbbec4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367753 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/42a8f203-7e80-4e55-bd79-afe843279906-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8kj4z\" (UID: \"42a8f203-7e80-4e55-bd79-afe843279906\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367829 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367877 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367957 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-image-import-ca\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.367982 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-encryption-config\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.368039 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.368064 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.368086 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-wkm8d\" (UniqueName: \"kubernetes.io/projected/eb7b09b1-f092-4e2b-8b07-e03343753503-kube-api-access-wkm8d\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.368102 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.368123 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/677bf100-9036-4b58-9658-6b918304ba47-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qvdbg\" (UID: \"677bf100-9036-4b58-9658-6b918304ba47\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.368144 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqlv2\" (UniqueName: \"kubernetes.io/projected/677bf100-9036-4b58-9658-6b918304ba47-kube-api-access-fqlv2\") pod \"machine-api-operator-5694c8668f-qvdbg\" (UID: \"677bf100-9036-4b58-9658-6b918304ba47\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.368232 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.368276 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2bxr\" (UniqueName: \"kubernetes.io/projected/011633b2-37cb-46bd-b120-a9a9023d40fb-kube-api-access-m2bxr\") pod \"openshift-apiserver-operator-796bbdcf4f-blfck\" (UID: \"011633b2-37cb-46bd-b120-a9a9023d40fb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.368298 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/677bf100-9036-4b58-9658-6b918304ba47-images\") pod \"machine-api-operator-5694c8668f-qvdbg\" (UID: \"677bf100-9036-4b58-9658-6b918304ba47\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.370325 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.390489 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.412112 4691 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.431667 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.457333 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.468750 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.468930 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad8eb845-25ea-4917-80a0-46e84ed6ef97-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.468969 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/688a963d-2808-4961-a584-1ee4a3ada61d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sv92r\" (UID: \"688a963d-2808-4961-a584-1ee4a3ada61d\") " pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.468988 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:04.968966673 +0000 UTC m=+132.753045525 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469011 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkwgt\" (UniqueName: \"kubernetes.io/projected/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-kube-api-access-rkwgt\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469039 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-socket-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469061 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sx8m\" (UniqueName: \"kubernetes.io/projected/1eb687f4-51f6-4806-b5a0-e35639b4b019-kube-api-access-6sx8m\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469082 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/01d8467f-e617-4b68-ada8-440891bb4b51-srv-cert\") pod \"catalog-operator-68c6474976-vgwkd\" (UID: \"01d8467f-e617-4b68-ada8-440891bb4b51\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469102 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql5t7\" (UniqueName: \"kubernetes.io/projected/c8c8387e-bdc5-4d7a-ae05-776786ee7277-kube-api-access-ql5t7\") pod \"ingress-operator-5b745b69d9-b28bc\" (UID: \"c8c8387e-bdc5-4d7a-ae05-776786ee7277\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469127 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bks2b\" (UniqueName: \"kubernetes.io/projected/ad8eb845-25ea-4917-80a0-46e84ed6ef97-kube-api-access-bks2b\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469150 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/688a963d-2808-4961-a584-1ee4a3ada61d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sv92r\" (UID: \"688a963d-2808-4961-a584-1ee4a3ada61d\") " pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469175 4691 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh2b4\" (UniqueName: \"kubernetes.io/projected/54581392-8f19-498b-b24d-c35064382946-kube-api-access-hh2b4\") pod \"machine-config-operator-74547568cd-7pvc7\" (UID: \"54581392-8f19-498b-b24d-c35064382946\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469199 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec8623c-5335-43a5-8431-49e66d87b4f5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8mw2g\" (UID: \"4ec8623c-5335-43a5-8431-49e66d87b4f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469220 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469265 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469374 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-etcd-client\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469403 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-config\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469429 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2b6t\" (UniqueName: \"kubernetes.io/projected/20e58619-0441-4d0e-9542-b2b8948099ef-kube-api-access-z2b6t\") pod \"machine-config-server-8wz4g\" (UID: \"20e58619-0441-4d0e-9542-b2b8948099ef\") " pod="openshift-machine-config-operator/machine-config-server-8wz4g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469476 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469501 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469560 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469583 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-serving-cert\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469626 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/349309ae-e421-4055-96a2-0480c5562853-proxy-tls\") pod \"machine-config-controller-84d6567774-nkzgs\" (UID: \"349309ae-e421-4055-96a2-0480c5562853\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469649 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls22m\" (UniqueName: \"kubernetes.io/projected/dfefdd27-4ee7-4194-990d-0199fbe83a47-kube-api-access-ls22m\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469669 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw5cj\" (UniqueName: \"kubernetes.io/projected/483d25f5-c04d-4c89-a260-59cc01d12255-kube-api-access-qw5cj\") pod \"packageserver-d55dfcdfc-tl75k\" (UID: \"483d25f5-c04d-4c89-a260-59cc01d12255\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469785 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.469925 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfzrp\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-kube-api-access-pfzrp\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470063 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" 
(UniqueName: \"kubernetes.io/secret/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-default-certificate\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.470099 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:04.970080212 +0000 UTC m=+132.754159154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470143 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lkjj\" (UniqueName: \"kubernetes.io/projected/33710686-f504-4a80-a2d4-e42b0bfb3033-kube-api-access-4lkjj\") pod \"package-server-manager-789f6589d5-5qs4n\" (UID: \"33710686-f504-4a80-a2d4-e42b0bfb3033\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470178 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4244w\" (UniqueName: \"kubernetes.io/projected/688a963d-2808-4961-a584-1ee4a3ada61d-kube-api-access-4244w\") pod \"marketplace-operator-79b997595-sv92r\" (UID: \"688a963d-2808-4961-a584-1ee4a3ada61d\") " pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470214 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470243 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470272 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-etcd-client\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470285 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad8eb845-25ea-4917-80a0-46e84ed6ef97-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: 
\"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470297 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-config-volume\") pod \"dns-default-jcpp8\" (UID: \"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c\") " pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470343 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470496 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-encryption-config\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470537 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24830de-18e3-4204-8996-e7f1b0d45aec-config\") pod \"kube-controller-manager-operator-78b949d7b-hd92g\" (UID: \"f24830de-18e3-4204-8996-e7f1b0d45aec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470565 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/483d25f5-c04d-4c89-a260-59cc01d12255-webhook-cert\") pod \"packageserver-d55dfcdfc-tl75k\" (UID: \"483d25f5-c04d-4c89-a260-59cc01d12255\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470584 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0d511610-c204-4e20-a95e-43a7b41332b8-signing-key\") pod \"service-ca-9c57cc56f-ncvcc\" (UID: \"0d511610-c204-4e20-a95e-43a7b41332b8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470607 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/33710686-f504-4a80-a2d4-e42b0bfb3033-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5qs4n\" (UID: \"33710686-f504-4a80-a2d4-e42b0bfb3033\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470634 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470644 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ec8623c-5335-43a5-8431-49e66d87b4f5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8mw2g\" (UID: \"4ec8623c-5335-43a5-8431-49e66d87b4f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470681 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-plugins-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470742 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54581392-8f19-498b-b24d-c35064382946-proxy-tls\") pod \"machine-config-operator-74547568cd-7pvc7\" (UID: \"54581392-8f19-498b-b24d-c35064382946\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470862 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/677bf100-9036-4b58-9658-6b918304ba47-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qvdbg\" (UID: \"677bf100-9036-4b58-9658-6b918304ba47\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470896 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqlv2\" (UniqueName: \"kubernetes.io/projected/677bf100-9036-4b58-9658-6b918304ba47-kube-api-access-fqlv2\") pod \"machine-api-operator-5694c8668f-qvdbg\" (UID: \"677bf100-9036-4b58-9658-6b918304ba47\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470921 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470945 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-oauth-config\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470961 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zscd5\" (UniqueName: \"kubernetes.io/projected/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-kube-api-access-zscd5\") pod \"dns-default-jcpp8\" (UID: \"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c\") " pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:04 crc 
kubenswrapper[4691]: I1202 07:48:04.470979 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.470994 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-csi-data-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471011 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-audit\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471030 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b63660-5845-41eb-94ca-ffc8ccb34413-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8x9pv\" (UID: \"39b63660-5845-41eb-94ca-ffc8ccb34413\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471048 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49bqr\" (UniqueName: \"kubernetes.io/projected/3cf0e12e-115d-4ac5-b646-6ac98524d948-kube-api-access-49bqr\") pod \"migrator-59844c95c7-4ms2c\" (UID: \"3cf0e12e-115d-4ac5-b646-6ac98524d948\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ms2c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471068 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1b9abc34-1981-4f2b-b827-5347ed0f3d7e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x96cz\" (UID: \"1b9abc34-1981-4f2b-b827-5347ed0f3d7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471082 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh9k4\" (UniqueName: \"kubernetes.io/projected/1b9abc34-1981-4f2b-b827-5347ed0f3d7e-kube-api-access-lh9k4\") pod \"olm-operator-6b444d44fb-x96cz\" (UID: \"1b9abc34-1981-4f2b-b827-5347ed0f3d7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471109 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-audit-dir\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471138 4691 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-node-pullsecrets\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471153 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-audit-policies\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471170 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0f413a7-cc30-430c-a9a9-b7eb6da2916d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6kpnf\" (UID: \"a0f413a7-cc30-430c-a9a9-b7eb6da2916d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471186 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86sks\" (UniqueName: \"kubernetes.io/projected/943a92a5-ba00-456a-83f4-c383e252288a-kube-api-access-86sks\") pod \"collect-profiles-29411025-88q9j\" (UID: \"943a92a5-ba00-456a-83f4-c383e252288a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471205 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/011633b2-37cb-46bd-b120-a9a9023d40fb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-blfck\" (UID: \"011633b2-37cb-46bd-b120-a9a9023d40fb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471224 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/943a92a5-ba00-456a-83f4-c383e252288a-config-volume\") pod \"collect-profiles-29411025-88q9j\" (UID: \"943a92a5-ba00-456a-83f4-c383e252288a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471245 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-registry-certificates\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471261 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-service-ca\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471277 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-metrics-certs\") pod 
\"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471294 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-service-ca\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471333 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-client\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471350 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-config\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471366 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27fa245f-f203-4dfd-9ca1-78bc7c9e17a6-cert\") pod \"ingress-canary-hs9sj\" (UID: \"27fa245f-f203-4dfd-9ca1-78bc7c9e17a6\") " pod="openshift-ingress-canary/ingress-canary-hs9sj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471382 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-serving-cert\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471397 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c7ffef5-c616-4596-aab4-6daa6eed6d46-serving-cert\") pod \"console-operator-58897d9998-tcd8d\" (UID: \"5c7ffef5-c616-4596-aab4-6daa6eed6d46\") " pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471416 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1b9abc34-1981-4f2b-b827-5347ed0f3d7e-srv-cert\") pod \"olm-operator-6b444d44fb-x96cz\" (UID: \"1b9abc34-1981-4f2b-b827-5347ed0f3d7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471443 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-client-ca\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471459 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c7ffef5-c616-4596-aab4-6daa6eed6d46-trusted-ca\") pod \"console-operator-58897d9998-tcd8d\" (UID: \"5c7ffef5-c616-4596-aab4-6daa6eed6d46\") " pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471475 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-serving-cert\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471506 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471522 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t58tg\" (UniqueName: \"kubernetes.io/projected/0d511610-c204-4e20-a95e-43a7b41332b8-kube-api-access-t58tg\") pod \"service-ca-9c57cc56f-ncvcc\" (UID: \"0d511610-c204-4e20-a95e-43a7b41332b8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471531 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471537 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfefdd27-4ee7-4194-990d-0199fbe83a47-serving-cert\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471607 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c561340-bdf6-4d50-85c4-10a9098e12b7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hffd\" (UID: \"2c561340-bdf6-4d50-85c4-10a9098e12b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471649 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/54581392-8f19-498b-b24d-c35064382946-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7pvc7\" (UID: \"54581392-8f19-498b-b24d-c35064382946\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471674 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/c8c8387e-bdc5-4d7a-ae05-776786ee7277-trusted-ca\") pod \"ingress-operator-5b745b69d9-b28bc\" (UID: \"c8c8387e-bdc5-4d7a-ae05-776786ee7277\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471702 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ec8623c-5335-43a5-8431-49e66d87b4f5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8mw2g\" (UID: \"4ec8623c-5335-43a5-8431-49e66d87b4f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471742 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8c8387e-bdc5-4d7a-ae05-776786ee7277-bound-sa-token\") pod \"ingress-operator-5b745b69d9-b28bc\" (UID: \"c8c8387e-bdc5-4d7a-ae05-776786ee7277\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471795 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpg4q\" (UniqueName: \"kubernetes.io/projected/2c561340-bdf6-4d50-85c4-10a9098e12b7-kube-api-access-bpg4q\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hffd\" (UID: \"2c561340-bdf6-4d50-85c4-10a9098e12b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471833 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-bound-sa-token\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471857 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnjdw\" (UniqueName: \"kubernetes.io/projected/01d8467f-e617-4b68-ada8-440891bb4b51-kube-api-access-wnjdw\") pod \"catalog-operator-68c6474976-vgwkd\" (UID: \"01d8467f-e617-4b68-ada8-440891bb4b51\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471882 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad8eb845-25ea-4917-80a0-46e84ed6ef97-service-ca-bundle\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471899 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471905 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-etcd-serving-ca\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 
07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471928 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-metrics-tls\") pod \"dns-default-jcpp8\" (UID: \"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c\") " pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471952 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/011633b2-37cb-46bd-b120-a9a9023d40fb-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-blfck\" (UID: \"011633b2-37cb-46bd-b120-a9a9023d40fb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471976 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5qq7\" (UniqueName: \"kubernetes.io/projected/42a8f203-7e80-4e55-bd79-afe843279906-kube-api-access-c5qq7\") pod \"cluster-samples-operator-665b6dd947-8kj4z\" (UID: \"42a8f203-7e80-4e55-bd79-afe843279906\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.471999 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f24830de-18e3-4204-8996-e7f1b0d45aec-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hd92g\" (UID: \"f24830de-18e3-4204-8996-e7f1b0d45aec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.472025 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-audit-policies\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.472097 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-audit-dir\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.472334 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.472698 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-audit\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.472864 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.473054 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzsqc\" (UniqueName: \"kubernetes.io/projected/349309ae-e421-4055-96a2-0480c5562853-kube-api-access-lzsqc\") pod \"machine-config-controller-84d6567774-nkzgs\" (UID: \"349309ae-e421-4055-96a2-0480c5562853\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.473126 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-node-pullsecrets\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.473235 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-config\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.473991 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.474020 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-audit-policies\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.474227 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-encryption-config\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.474537 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.474599 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c7ffef5-c616-4596-aab4-6daa6eed6d46-trusted-ca\") pod \"console-operator-58897d9998-tcd8d\" (UID: \"5c7ffef5-c616-4596-aab4-6daa6eed6d46\") " 
pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.474657 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-client-ca\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.474867 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-serving-cert\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.474917 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad8eb845-25ea-4917-80a0-46e84ed6ef97-service-ca-bundle\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.474971 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce16c153-56c1-4d91-92db-e52b045186bb-config\") pod \"service-ca-operator-777779d784-p896d\" (UID: \"ce16c153-56c1-4d91-92db-e52b045186bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475025 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/943a92a5-ba00-456a-83f4-c383e252288a-secret-volume\") pod \"collect-profiles-29411025-88q9j\" (UID: \"943a92a5-ba00-456a-83f4-c383e252288a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475053 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpdrb\" (UniqueName: \"kubernetes.io/projected/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-kube-api-access-vpdrb\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475079 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-service-ca-bundle\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475104 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f24830de-18e3-4204-8996-e7f1b0d45aec-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hd92g\" (UID: \"f24830de-18e3-4204-8996-e7f1b0d45aec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475131 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-config\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475157 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/01d8467f-e617-4b68-ada8-440891bb4b51-profile-collector-cert\") pod \"catalog-operator-68c6474976-vgwkd\" (UID: \"01d8467f-e617-4b68-ada8-440891bb4b51\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475182 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce16c153-56c1-4d91-92db-e52b045186bb-serving-cert\") pod \"service-ca-operator-777779d784-p896d\" (UID: \"ce16c153-56c1-4d91-92db-e52b045186bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475210 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-trusted-ca\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475233 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c7ffef5-c616-4596-aab4-6daa6eed6d46-config\") pod \"console-operator-58897d9998-tcd8d\" (UID: \"5c7ffef5-c616-4596-aab4-6daa6eed6d46\") " pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475258 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twxxt\" (UniqueName: \"kubernetes.io/projected/5c7ffef5-c616-4596-aab4-6daa6eed6d46-kube-api-access-twxxt\") pod \"console-operator-58897d9998-tcd8d\" (UID: \"5c7ffef5-c616-4596-aab4-6daa6eed6d46\") " pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475282 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475319 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e8a5b37-f843-459c-93ca-379044dbbec4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7ff8p\" (UID: \"1e8a5b37-f843-459c-93ca-379044dbbec4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475374 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/42a8f203-7e80-4e55-bd79-afe843279906-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8kj4z\" (UID: \"42a8f203-7e80-4e55-bd79-afe843279906\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475493 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-audit-dir\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475535 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475562 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/483d25f5-c04d-4c89-a260-59cc01d12255-apiservice-cert\") pod \"packageserver-d55dfcdfc-tl75k\" (UID: \"483d25f5-c04d-4c89-a260-59cc01d12255\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475586 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-image-import-ca\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475608 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-registration-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475631 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/20e58619-0441-4d0e-9542-b2b8948099ef-certs\") pod \"machine-config-server-8wz4g\" (UID: \"20e58619-0441-4d0e-9542-b2b8948099ef\") " pod="openshift-machine-config-operator/machine-config-server-8wz4g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475742 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkm8d\" (UniqueName: \"kubernetes.io/projected/eb7b09b1-f092-4e2b-8b07-e03343753503-kube-api-access-wkm8d\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475787 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: 
\"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475814 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475848 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5d2fe571-0e98-42f1-8f71-5cdc773ec89e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lb7g9\" (UID: \"5d2fe571-0e98-42f1-8f71-5cdc773ec89e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lb7g9" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475869 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-stats-auth\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475896 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2bxr\" (UniqueName: \"kubernetes.io/projected/011633b2-37cb-46bd-b120-a9a9023d40fb-kube-api-access-m2bxr\") pod \"openshift-apiserver-operator-796bbdcf4f-blfck\" (UID: \"011633b2-37cb-46bd-b120-a9a9023d40fb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475919 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/677bf100-9036-4b58-9658-6b918304ba47-images\") pod \"machine-api-operator-5694c8668f-qvdbg\" (UID: \"677bf100-9036-4b58-9658-6b918304ba47\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475944 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f413a7-cc30-430c-a9a9-b7eb6da2916d-config\") pod \"kube-apiserver-operator-766d6c64bb-6kpnf\" (UID: \"a0f413a7-cc30-430c-a9a9-b7eb6da2916d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475967 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/483d25f5-c04d-4c89-a260-59cc01d12255-tmpfs\") pod \"packageserver-d55dfcdfc-tl75k\" (UID: \"483d25f5-c04d-4c89-a260-59cc01d12255\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475973 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-audit-dir\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: 
I1202 07:48:04.475988 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c8c8387e-bdc5-4d7a-ae05-776786ee7277-metrics-tls\") pod \"ingress-operator-5b745b69d9-b28bc\" (UID: \"c8c8387e-bdc5-4d7a-ae05-776786ee7277\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476012 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-encryption-config\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476033 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcvzn\" (UniqueName: \"kubernetes.io/projected/39b63660-5845-41eb-94ca-ffc8ccb34413-kube-api-access-xcvzn\") pod \"control-plane-machine-set-operator-78cbb6b69f-8x9pv\" (UID: \"39b63660-5845-41eb-94ca-ffc8ccb34413\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476057 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x7pm\" (UniqueName: \"kubernetes.io/projected/ce16c153-56c1-4d91-92db-e52b045186bb-kube-api-access-4x7pm\") pod \"service-ca-operator-777779d784-p896d\" (UID: \"ce16c153-56c1-4d91-92db-e52b045186bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476079 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476101 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nlk6\" (UniqueName: \"kubernetes.io/projected/27fa245f-f203-4dfd-9ca1-78bc7c9e17a6-kube-api-access-4nlk6\") pod \"ingress-canary-hs9sj\" (UID: \"27fa245f-f203-4dfd-9ca1-78bc7c9e17a6\") " pod="openshift-ingress-canary/ingress-canary-hs9sj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476131 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/54581392-8f19-498b-b24d-c35064382946-images\") pod \"machine-config-operator-74547568cd-7pvc7\" (UID: \"54581392-8f19-498b-b24d-c35064382946\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476152 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/349309ae-e421-4055-96a2-0480c5562853-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nkzgs\" (UID: \"349309ae-e421-4055-96a2-0480c5562853\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476175 
4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad8eb845-25ea-4917-80a0-46e84ed6ef97-config\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476198 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476220 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476244 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bgj9\" (UniqueName: \"kubernetes.io/projected/1e8a5b37-f843-459c-93ca-379044dbbec4-kube-api-access-7bgj9\") pod \"openshift-controller-manager-operator-756b6f6bc6-7ff8p\" (UID: \"1e8a5b37-f843-459c-93ca-379044dbbec4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476266 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0f413a7-cc30-430c-a9a9-b7eb6da2916d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6kpnf\" (UID: \"a0f413a7-cc30-430c-a9a9-b7eb6da2916d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.475341 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476290 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/20e58619-0441-4d0e-9542-b2b8948099ef-node-bootstrap-token\") pod \"machine-config-server-8wz4g\" (UID: \"20e58619-0441-4d0e-9542-b2b8948099ef\") " pod="openshift-machine-config-operator/machine-config-server-8wz4g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476581 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476593 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0d511610-c204-4e20-a95e-43a7b41332b8-signing-cabundle\") pod \"service-ca-9c57cc56f-ncvcc\" (UID: \"0d511610-c204-4e20-a95e-43a7b41332b8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476640 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad8eb845-25ea-4917-80a0-46e84ed6ef97-serving-cert\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476666 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd9dk\" (UniqueName: \"kubernetes.io/projected/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-kube-api-access-kd9dk\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476703 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677bf100-9036-4b58-9658-6b918304ba47-config\") pod \"machine-api-operator-5694c8668f-qvdbg\" (UID: \"677bf100-9036-4b58-9658-6b918304ba47\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476729 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rksj\" (UniqueName: \"kubernetes.io/projected/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-kube-api-access-8rksj\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476771 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7b09b1-f092-4e2b-8b07-e03343753503-serving-cert\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476804 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-oauth-serving-cert\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476827 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-ca\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476854 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c561340-bdf6-4d50-85c4-10a9098e12b7-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-7hffd\" (UID: \"2c561340-bdf6-4d50-85c4-10a9098e12b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476883 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drtn4\" (UniqueName: \"kubernetes.io/projected/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-kube-api-access-drtn4\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.476906 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-trusted-ca-bundle\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.477105 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad8eb845-25ea-4917-80a0-46e84ed6ef97-config\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.477601 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-trusted-ca\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.477737 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-registry-tls\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.478056 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-mountpoint-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.478071 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-audit-policies\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.478103 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6rbn\" (UniqueName: \"kubernetes.io/projected/5d2fe571-0e98-42f1-8f71-5cdc773ec89e-kube-api-access-w6rbn\") pod \"multus-admission-controller-857f4d67dd-lb7g9\" (UID: \"5d2fe571-0e98-42f1-8f71-5cdc773ec89e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lb7g9" Dec 02 07:48:04 crc 
kubenswrapper[4691]: I1202 07:48:04.478191 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-audit-dir\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.478226 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e8a5b37-f843-459c-93ca-379044dbbec4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7ff8p\" (UID: \"1e8a5b37-f843-459c-93ca-379044dbbec4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.478644 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-audit-dir\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.478726 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-registry-certificates\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.478747 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.479609 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-encryption-config\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.479641 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.479682 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-image-import-ca\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.479708 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/011633b2-37cb-46bd-b120-a9a9023d40fb-serving-cert\") pod 
\"openshift-apiserver-operator-796bbdcf4f-blfck\" (UID: \"011633b2-37cb-46bd-b120-a9a9023d40fb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.479977 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c7ffef5-c616-4596-aab4-6daa6eed6d46-config\") pod \"console-operator-58897d9998-tcd8d\" (UID: \"5c7ffef5-c616-4596-aab4-6daa6eed6d46\") " pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.480074 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.480193 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-etcd-serving-ca\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.480902 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7b09b1-f092-4e2b-8b07-e03343753503-serving-cert\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.481264 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/677bf100-9036-4b58-9658-6b918304ba47-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qvdbg\" (UID: \"677bf100-9036-4b58-9658-6b918304ba47\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.481287 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-config\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.481378 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-serving-cert\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.481471 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e8a5b37-f843-459c-93ca-379044dbbec4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7ff8p\" (UID: \"1e8a5b37-f843-459c-93ca-379044dbbec4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.481530 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.481876 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677bf100-9036-4b58-9658-6b918304ba47-config\") pod \"machine-api-operator-5694c8668f-qvdbg\" (UID: \"677bf100-9036-4b58-9658-6b918304ba47\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.481987 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-etcd-client\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.482072 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/677bf100-9036-4b58-9658-6b918304ba47-images\") pod \"machine-api-operator-5694c8668f-qvdbg\" (UID: \"677bf100-9036-4b58-9658-6b918304ba47\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.482079 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-config\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.482107 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/42a8f203-7e80-4e55-bd79-afe843279906-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-8kj4z\" (UID: \"42a8f203-7e80-4e55-bd79-afe843279906\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.482249 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.482503 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.482567 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c7ffef5-c616-4596-aab4-6daa6eed6d46-serving-cert\") pod \"console-operator-58897d9998-tcd8d\" (UID: \"5c7ffef5-c616-4596-aab4-6daa6eed6d46\") " pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 
07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.482929 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.483029 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1e8a5b37-f843-459c-93ca-379044dbbec4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7ff8p\" (UID: \"1e8a5b37-f843-459c-93ca-379044dbbec4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.483082 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/011633b2-37cb-46bd-b120-a9a9023d40fb-config\") pod \"openshift-apiserver-operator-796bbdcf4f-blfck\" (UID: \"011633b2-37cb-46bd-b120-a9a9023d40fb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.483457 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.483720 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-registry-tls\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.484034 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad8eb845-25ea-4917-80a0-46e84ed6ef97-serving-cert\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.484287 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-etcd-client\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.491587 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.510537 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.531018 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: 
I1202 07:48:04.551031 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.560770 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.560826 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.560920 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.561121 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.572015 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579290 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.579437 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.079417271 +0000 UTC m=+132.863496133 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579469 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/01d8467f-e617-4b68-ada8-440891bb4b51-profile-collector-cert\") pod \"catalog-operator-68c6474976-vgwkd\" (UID: \"01d8467f-e617-4b68-ada8-440891bb4b51\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579503 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce16c153-56c1-4d91-92db-e52b045186bb-serving-cert\") pod \"service-ca-operator-777779d784-p896d\" (UID: \"ce16c153-56c1-4d91-92db-e52b045186bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579539 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-registration-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579560 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/20e58619-0441-4d0e-9542-b2b8948099ef-certs\") pod \"machine-config-server-8wz4g\" (UID: \"20e58619-0441-4d0e-9542-b2b8948099ef\") " pod="openshift-machine-config-operator/machine-config-server-8wz4g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579581 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/483d25f5-c04d-4c89-a260-59cc01d12255-apiservice-cert\") pod \"packageserver-d55dfcdfc-tl75k\" (UID: \"483d25f5-c04d-4c89-a260-59cc01d12255\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579638 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5d2fe571-0e98-42f1-8f71-5cdc773ec89e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lb7g9\" (UID: \"5d2fe571-0e98-42f1-8f71-5cdc773ec89e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lb7g9" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579663 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-stats-auth\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579696 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/a0f413a7-cc30-430c-a9a9-b7eb6da2916d-config\") pod \"kube-apiserver-operator-766d6c64bb-6kpnf\" (UID: \"a0f413a7-cc30-430c-a9a9-b7eb6da2916d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579716 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/483d25f5-c04d-4c89-a260-59cc01d12255-tmpfs\") pod \"packageserver-d55dfcdfc-tl75k\" (UID: \"483d25f5-c04d-4c89-a260-59cc01d12255\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579737 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c8c8387e-bdc5-4d7a-ae05-776786ee7277-metrics-tls\") pod \"ingress-operator-5b745b69d9-b28bc\" (UID: \"c8c8387e-bdc5-4d7a-ae05-776786ee7277\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579785 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcvzn\" (UniqueName: \"kubernetes.io/projected/39b63660-5845-41eb-94ca-ffc8ccb34413-kube-api-access-xcvzn\") pod \"control-plane-machine-set-operator-78cbb6b69f-8x9pv\" (UID: \"39b63660-5845-41eb-94ca-ffc8ccb34413\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579807 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x7pm\" (UniqueName: \"kubernetes.io/projected/ce16c153-56c1-4d91-92db-e52b045186bb-kube-api-access-4x7pm\") pod \"service-ca-operator-777779d784-p896d\" (UID: \"ce16c153-56c1-4d91-92db-e52b045186bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579836 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nlk6\" (UniqueName: \"kubernetes.io/projected/27fa245f-f203-4dfd-9ca1-78bc7c9e17a6-kube-api-access-4nlk6\") pod \"ingress-canary-hs9sj\" (UID: \"27fa245f-f203-4dfd-9ca1-78bc7c9e17a6\") " pod="openshift-ingress-canary/ingress-canary-hs9sj" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579857 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/54581392-8f19-498b-b24d-c35064382946-images\") pod \"machine-config-operator-74547568cd-7pvc7\" (UID: \"54581392-8f19-498b-b24d-c35064382946\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579860 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-registration-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579878 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/349309ae-e421-4055-96a2-0480c5562853-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nkzgs\" (UID: \"349309ae-e421-4055-96a2-0480c5562853\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579915 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0f413a7-cc30-430c-a9a9-b7eb6da2916d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6kpnf\" (UID: \"a0f413a7-cc30-430c-a9a9-b7eb6da2916d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579936 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/20e58619-0441-4d0e-9542-b2b8948099ef-node-bootstrap-token\") pod \"machine-config-server-8wz4g\" (UID: \"20e58619-0441-4d0e-9542-b2b8948099ef\") " pod="openshift-machine-config-operator/machine-config-server-8wz4g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.579956 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0d511610-c204-4e20-a95e-43a7b41332b8-signing-cabundle\") pod \"service-ca-9c57cc56f-ncvcc\" (UID: \"0d511610-c204-4e20-a95e-43a7b41332b8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580001 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rksj\" (UniqueName: \"kubernetes.io/projected/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-kube-api-access-8rksj\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580026 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-oauth-serving-cert\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580049 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-ca\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580069 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c561340-bdf6-4d50-85c4-10a9098e12b7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hffd\" (UID: \"2c561340-bdf6-4d50-85c4-10a9098e12b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580095 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-trusted-ca-bundle\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580120 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-mountpoint-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580141 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6rbn\" (UniqueName: \"kubernetes.io/projected/5d2fe571-0e98-42f1-8f71-5cdc773ec89e-kube-api-access-w6rbn\") pod \"multus-admission-controller-857f4d67dd-lb7g9\" (UID: \"5d2fe571-0e98-42f1-8f71-5cdc773ec89e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lb7g9" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580165 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/688a963d-2808-4961-a584-1ee4a3ada61d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sv92r\" (UID: \"688a963d-2808-4961-a584-1ee4a3ada61d\") " pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580197 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/688a963d-2808-4961-a584-1ee4a3ada61d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sv92r\" (UID: \"688a963d-2808-4961-a584-1ee4a3ada61d\") " pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580222 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkwgt\" (UniqueName: \"kubernetes.io/projected/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-kube-api-access-rkwgt\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580239 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/483d25f5-c04d-4c89-a260-59cc01d12255-tmpfs\") pod \"packageserver-d55dfcdfc-tl75k\" (UID: \"483d25f5-c04d-4c89-a260-59cc01d12255\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580244 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-socket-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580303 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sx8m\" (UniqueName: \"kubernetes.io/projected/1eb687f4-51f6-4806-b5a0-e35639b4b019-kube-api-access-6sx8m\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580331 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-socket-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc 
kubenswrapper[4691]: I1202 07:48:04.580351 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-mountpoint-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580333 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/01d8467f-e617-4b68-ada8-440891bb4b51-srv-cert\") pod \"catalog-operator-68c6474976-vgwkd\" (UID: \"01d8467f-e617-4b68-ada8-440891bb4b51\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580586 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql5t7\" (UniqueName: \"kubernetes.io/projected/c8c8387e-bdc5-4d7a-ae05-776786ee7277-kube-api-access-ql5t7\") pod \"ingress-operator-5b745b69d9-b28bc\" (UID: \"c8c8387e-bdc5-4d7a-ae05-776786ee7277\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580614 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh2b4\" (UniqueName: \"kubernetes.io/projected/54581392-8f19-498b-b24d-c35064382946-kube-api-access-hh2b4\") pod \"machine-config-operator-74547568cd-7pvc7\" (UID: \"54581392-8f19-498b-b24d-c35064382946\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580637 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec8623c-5335-43a5-8431-49e66d87b4f5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8mw2g\" (UID: \"4ec8623c-5335-43a5-8431-49e66d87b4f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580681 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-config\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580708 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2b6t\" (UniqueName: \"kubernetes.io/projected/20e58619-0441-4d0e-9542-b2b8948099ef-kube-api-access-z2b6t\") pod \"machine-config-server-8wz4g\" (UID: \"20e58619-0441-4d0e-9542-b2b8948099ef\") " pod="openshift-machine-config-operator/machine-config-server-8wz4g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580737 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580795 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls22m\" 
(UniqueName: \"kubernetes.io/projected/dfefdd27-4ee7-4194-990d-0199fbe83a47-kube-api-access-ls22m\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580827 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw5cj\" (UniqueName: \"kubernetes.io/projected/483d25f5-c04d-4c89-a260-59cc01d12255-kube-api-access-qw5cj\") pod \"packageserver-d55dfcdfc-tl75k\" (UID: \"483d25f5-c04d-4c89-a260-59cc01d12255\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580859 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/349309ae-e421-4055-96a2-0480c5562853-proxy-tls\") pod \"machine-config-controller-84d6567774-nkzgs\" (UID: \"349309ae-e421-4055-96a2-0480c5562853\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580902 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lkjj\" (UniqueName: \"kubernetes.io/projected/33710686-f504-4a80-a2d4-e42b0bfb3033-kube-api-access-4lkjj\") pod \"package-server-manager-789f6589d5-5qs4n\" (UID: \"33710686-f504-4a80-a2d4-e42b0bfb3033\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580925 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-default-certificate\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580950 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4244w\" (UniqueName: \"kubernetes.io/projected/688a963d-2808-4961-a584-1ee4a3ada61d-kube-api-access-4244w\") pod \"marketplace-operator-79b997595-sv92r\" (UID: \"688a963d-2808-4961-a584-1ee4a3ada61d\") " pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.580977 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-config-volume\") pod \"dns-default-jcpp8\" (UID: \"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c\") " pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581002 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24830de-18e3-4204-8996-e7f1b0d45aec-config\") pod \"kube-controller-manager-operator-78b949d7b-hd92g\" (UID: \"f24830de-18e3-4204-8996-e7f1b0d45aec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581023 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/483d25f5-c04d-4c89-a260-59cc01d12255-webhook-cert\") pod \"packageserver-d55dfcdfc-tl75k\" (UID: \"483d25f5-c04d-4c89-a260-59cc01d12255\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581047 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0d511610-c204-4e20-a95e-43a7b41332b8-signing-key\") pod \"service-ca-9c57cc56f-ncvcc\" (UID: \"0d511610-c204-4e20-a95e-43a7b41332b8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581071 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/33710686-f504-4a80-a2d4-e42b0bfb3033-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5qs4n\" (UID: \"33710686-f504-4a80-a2d4-e42b0bfb3033\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581094 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-plugins-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581128 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54581392-8f19-498b-b24d-c35064382946-proxy-tls\") pod \"machine-config-operator-74547568cd-7pvc7\" (UID: \"54581392-8f19-498b-b24d-c35064382946\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581171 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ec8623c-5335-43a5-8431-49e66d87b4f5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8mw2g\" (UID: \"4ec8623c-5335-43a5-8431-49e66d87b4f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g" Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.581193 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.081181447 +0000 UTC m=+132.865260309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581230 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-oauth-config\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581258 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zscd5\" (UniqueName: \"kubernetes.io/projected/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-kube-api-access-zscd5\") pod \"dns-default-jcpp8\" (UID: \"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c\") " pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581281 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-csi-data-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581314 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b63660-5845-41eb-94ca-ffc8ccb34413-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8x9pv\" (UID: \"39b63660-5845-41eb-94ca-ffc8ccb34413\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581339 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49bqr\" (UniqueName: \"kubernetes.io/projected/3cf0e12e-115d-4ac5-b646-6ac98524d948-kube-api-access-49bqr\") pod \"migrator-59844c95c7-4ms2c\" (UID: \"3cf0e12e-115d-4ac5-b646-6ac98524d948\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ms2c" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581381 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1b9abc34-1981-4f2b-b827-5347ed0f3d7e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x96cz\" (UID: \"1b9abc34-1981-4f2b-b827-5347ed0f3d7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581407 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh9k4\" (UniqueName: \"kubernetes.io/projected/1b9abc34-1981-4f2b-b827-5347ed0f3d7e-kube-api-access-lh9k4\") pod \"olm-operator-6b444d44fb-x96cz\" (UID: \"1b9abc34-1981-4f2b-b827-5347ed0f3d7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581433 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0f413a7-cc30-430c-a9a9-b7eb6da2916d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6kpnf\" (UID: \"a0f413a7-cc30-430c-a9a9-b7eb6da2916d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581444 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/349309ae-e421-4055-96a2-0480c5562853-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-nkzgs\" (UID: \"349309ae-e421-4055-96a2-0480c5562853\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581495 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86sks\" (UniqueName: \"kubernetes.io/projected/943a92a5-ba00-456a-83f4-c383e252288a-kube-api-access-86sks\") pod \"collect-profiles-29411025-88q9j\" (UID: \"943a92a5-ba00-456a-83f4-c383e252288a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581521 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/943a92a5-ba00-456a-83f4-c383e252288a-config-volume\") pod \"collect-profiles-29411025-88q9j\" (UID: \"943a92a5-ba00-456a-83f4-c383e252288a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581551 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-service-ca\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581571 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-metrics-certs\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581594 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-service-ca\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581620 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-client\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581642 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27fa245f-f203-4dfd-9ca1-78bc7c9e17a6-cert\") pod \"ingress-canary-hs9sj\" (UID: \"27fa245f-f203-4dfd-9ca1-78bc7c9e17a6\") " pod="openshift-ingress-canary/ingress-canary-hs9sj" Dec 02 
07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581652 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-csi-data-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581687 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1b9abc34-1981-4f2b-b827-5347ed0f3d7e-srv-cert\") pod \"olm-operator-6b444d44fb-x96cz\" (UID: \"1b9abc34-1981-4f2b-b827-5347ed0f3d7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581729 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-serving-cert\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581752 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t58tg\" (UniqueName: \"kubernetes.io/projected/0d511610-c204-4e20-a95e-43a7b41332b8-kube-api-access-t58tg\") pod \"service-ca-9c57cc56f-ncvcc\" (UID: \"0d511610-c204-4e20-a95e-43a7b41332b8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581792 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfefdd27-4ee7-4194-990d-0199fbe83a47-serving-cert\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581813 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c561340-bdf6-4d50-85c4-10a9098e12b7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hffd\" (UID: \"2c561340-bdf6-4d50-85c4-10a9098e12b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581836 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/54581392-8f19-498b-b24d-c35064382946-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7pvc7\" (UID: \"54581392-8f19-498b-b24d-c35064382946\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581859 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8c8387e-bdc5-4d7a-ae05-776786ee7277-trusted-ca\") pod \"ingress-operator-5b745b69d9-b28bc\" (UID: \"c8c8387e-bdc5-4d7a-ae05-776786ee7277\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581891 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ec8623c-5335-43a5-8431-49e66d87b4f5-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-8mw2g\" (UID: \"4ec8623c-5335-43a5-8431-49e66d87b4f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581917 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpg4q\" (UniqueName: \"kubernetes.io/projected/2c561340-bdf6-4d50-85c4-10a9098e12b7-kube-api-access-bpg4q\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hffd\" (UID: \"2c561340-bdf6-4d50-85c4-10a9098e12b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581938 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8c8387e-bdc5-4d7a-ae05-776786ee7277-bound-sa-token\") pod \"ingress-operator-5b745b69d9-b28bc\" (UID: \"c8c8387e-bdc5-4d7a-ae05-776786ee7277\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.581968 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnjdw\" (UniqueName: \"kubernetes.io/projected/01d8467f-e617-4b68-ada8-440891bb4b51-kube-api-access-wnjdw\") pod \"catalog-operator-68c6474976-vgwkd\" (UID: \"01d8467f-e617-4b68-ada8-440891bb4b51\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.582013 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-metrics-tls\") pod \"dns-default-jcpp8\" (UID: \"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c\") " pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.582038 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f24830de-18e3-4204-8996-e7f1b0d45aec-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hd92g\" (UID: \"f24830de-18e3-4204-8996-e7f1b0d45aec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.582067 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzsqc\" (UniqueName: \"kubernetes.io/projected/349309ae-e421-4055-96a2-0480c5562853-kube-api-access-lzsqc\") pod \"machine-config-controller-84d6567774-nkzgs\" (UID: \"349309ae-e421-4055-96a2-0480c5562853\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.582090 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-config\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.582111 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce16c153-56c1-4d91-92db-e52b045186bb-config\") pod \"service-ca-operator-777779d784-p896d\" (UID: \"ce16c153-56c1-4d91-92db-e52b045186bb\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.582136 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-service-ca-bundle\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.582158 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f24830de-18e3-4204-8996-e7f1b0d45aec-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hd92g\" (UID: \"f24830de-18e3-4204-8996-e7f1b0d45aec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.582180 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/943a92a5-ba00-456a-83f4-c383e252288a-secret-volume\") pod \"collect-profiles-29411025-88q9j\" (UID: \"943a92a5-ba00-456a-83f4-c383e252288a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.582241 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/1eb687f4-51f6-4806-b5a0-e35639b4b019-plugins-dir\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.582286 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f24830de-18e3-4204-8996-e7f1b0d45aec-config\") pod \"kube-controller-manager-operator-78b949d7b-hd92g\" (UID: \"f24830de-18e3-4204-8996-e7f1b0d45aec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.583122 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c561340-bdf6-4d50-85c4-10a9098e12b7-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hffd\" (UID: \"2c561340-bdf6-4d50-85c4-10a9098e12b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.583168 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/54581392-8f19-498b-b24d-c35064382946-auth-proxy-config\") pod \"machine-config-operator-74547568cd-7pvc7\" (UID: \"54581392-8f19-498b-b24d-c35064382946\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.584077 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ec8623c-5335-43a5-8431-49e66d87b4f5-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8mw2g\" (UID: \"4ec8623c-5335-43a5-8431-49e66d87b4f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 
07:48:04.584086 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2c561340-bdf6-4d50-85c4-10a9098e12b7-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hffd\" (UID: \"2c561340-bdf6-4d50-85c4-10a9098e12b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.584203 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c8c8387e-bdc5-4d7a-ae05-776786ee7277-metrics-tls\") pod \"ingress-operator-5b745b69d9-b28bc\" (UID: \"c8c8387e-bdc5-4d7a-ae05-776786ee7277\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.584371 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c8c8387e-bdc5-4d7a-ae05-776786ee7277-trusted-ca\") pod \"ingress-operator-5b745b69d9-b28bc\" (UID: \"c8c8387e-bdc5-4d7a-ae05-776786ee7277\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.584877 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5d2fe571-0e98-42f1-8f71-5cdc773ec89e-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-lb7g9\" (UID: \"5d2fe571-0e98-42f1-8f71-5cdc773ec89e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lb7g9" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.584915 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ec8623c-5335-43a5-8431-49e66d87b4f5-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8mw2g\" (UID: \"4ec8623c-5335-43a5-8431-49e66d87b4f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.586163 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f24830de-18e3-4204-8996-e7f1b0d45aec-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hd92g\" (UID: \"f24830de-18e3-4204-8996-e7f1b0d45aec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.592746 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.610793 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.622083 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/54581392-8f19-498b-b24d-c35064382946-images\") pod \"machine-config-operator-74547568cd-7pvc7\" (UID: \"54581392-8f19-498b-b24d-c35064382946\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.630232 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 07:48:04 crc 
kubenswrapper[4691]: I1202 07:48:04.650290 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.656080 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/54581392-8f19-498b-b24d-c35064382946-proxy-tls\") pod \"machine-config-operator-74547568cd-7pvc7\" (UID: \"54581392-8f19-498b-b24d-c35064382946\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.670893 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.683172 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.683316 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.183301356 +0000 UTC m=+132.967380218 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.683648 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.684054 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.184037975 +0000 UTC m=+132.968116837 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.691028 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.711689 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.716100 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/33710686-f504-4a80-a2d4-e42b0bfb3033-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-5qs4n\" (UID: \"33710686-f504-4a80-a2d4-e42b0bfb3033\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.732253 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.751238 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.764965 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-stats-auth\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.770809 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.784462 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.784581 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.284555223 +0000 UTC m=+133.068634125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.785407 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.785887 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.285865357 +0000 UTC m=+133.069944259 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.791567 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.811402 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.825097 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-default-certificate\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.831111 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.838213 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-metrics-certs\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.852735 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.864131 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-service-ca-bundle\") pod \"router-default-5444994796-h7dkw\" (UID: 
\"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.870987 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.886987 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.887094 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.387074332 +0000 UTC m=+133.171153224 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.887562 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.888147 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.388090609 +0000 UTC m=+133.172169481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.891841 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.896313 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/39b63660-5845-41eb-94ca-ffc8ccb34413-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8x9pv\" (UID: \"39b63660-5845-41eb-94ca-ffc8ccb34413\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.913692 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.931123 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.951543 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.971382 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.983739 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0f413a7-cc30-430c-a9a9-b7eb6da2916d-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-6kpnf\" (UID: \"a0f413a7-cc30-430c-a9a9-b7eb6da2916d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf" Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.988787 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.989044 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.489014867 +0000 UTC m=+133.273093779 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.989158 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:04 crc kubenswrapper[4691]: E1202 07:48:04.989558 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.489543351 +0000 UTC m=+133.273622223 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:04 crc kubenswrapper[4691]: I1202 07:48:04.990560 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.000626 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0f413a7-cc30-430c-a9a9-b7eb6da2916d-config\") pod \"kube-apiserver-operator-766d6c64bb-6kpnf\" (UID: \"a0f413a7-cc30-430c-a9a9-b7eb6da2916d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.019330 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.022459 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/688a963d-2808-4961-a584-1ee4a3ada61d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-sv92r\" (UID: \"688a963d-2808-4961-a584-1ee4a3ada61d\") " pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.031242 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.051227 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.064146 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/688a963d-2808-4961-a584-1ee4a3ada61d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-sv92r\" (UID: \"688a963d-2808-4961-a584-1ee4a3ada61d\") " pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.071882 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.091338 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.091392 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.091513 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.591487875 +0000 UTC m=+133.375566747 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.092239 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.092886 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.592862541 +0000 UTC m=+133.376941413 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.111131 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.116227 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/349309ae-e421-4055-96a2-0480c5562853-proxy-tls\") pod \"machine-config-controller-84d6567774-nkzgs\" (UID: \"349309ae-e421-4055-96a2-0480c5562853\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.132661 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.150978 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.158420 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/943a92a5-ba00-456a-83f4-c383e252288a-secret-volume\") pod \"collect-profiles-29411025-88q9j\" (UID: \"943a92a5-ba00-456a-83f4-c383e252288a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.158753 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1b9abc34-1981-4f2b-b827-5347ed0f3d7e-profile-collector-cert\") pod \"olm-operator-6b444d44fb-x96cz\" (UID: \"1b9abc34-1981-4f2b-b827-5347ed0f3d7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.165155 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/01d8467f-e617-4b68-ada8-440891bb4b51-profile-collector-cert\") pod \"catalog-operator-68c6474976-vgwkd\" (UID: \"01d8467f-e617-4b68-ada8-440891bb4b51\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.170717 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.185781 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/01d8467f-e617-4b68-ada8-440891bb4b51-srv-cert\") pod \"catalog-operator-68c6474976-vgwkd\" (UID: \"01d8467f-e617-4b68-ada8-440891bb4b51\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.191802 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 07:48:05 crc kubenswrapper[4691]: 
I1202 07:48:05.193502 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.193648 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.693628435 +0000 UTC m=+133.477707297 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.194087 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.194354 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.694345274 +0000 UTC m=+133.478424146 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.199859 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1b9abc34-1981-4f2b-b827-5347ed0f3d7e-srv-cert\") pod \"olm-operator-6b444d44fb-x96cz\" (UID: \"1b9abc34-1981-4f2b-b827-5347ed0f3d7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.211989 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.213831 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-service-ca\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.229525 4691 request.go:700] Waited for 1.011779604s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/secrets?fieldSelector=metadata.name%3Dconsole-dockercfg-f62pw&limit=500&resourceVersion=0 Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.231595 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.251793 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.256720 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-serving-cert\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.270706 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.275395 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-oauth-config\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.295121 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.295296 4691 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.795274022 +0000 UTC m=+133.579352894 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.295968 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.296337 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.796317569 +0000 UTC m=+133.580396441 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.307260 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.312287 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.317423 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-trusted-ca-bundle\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.321832 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-oauth-serving-cert\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.331797 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.343281 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-config\") pod 
\"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.351665 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.371246 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.390345 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.397039 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.397228 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.897205157 +0000 UTC m=+133.681284039 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.397581 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.397964 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.897952096 +0000 UTC m=+133.682030968 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.411479 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.424035 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/483d25f5-c04d-4c89-a260-59cc01d12255-apiservice-cert\") pod \"packageserver-d55dfcdfc-tl75k\" (UID: \"483d25f5-c04d-4c89-a260-59cc01d12255\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.424281 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/483d25f5-c04d-4c89-a260-59cc01d12255-webhook-cert\") pod \"packageserver-d55dfcdfc-tl75k\" (UID: \"483d25f5-c04d-4c89-a260-59cc01d12255\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.431207 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.452262 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.472100 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.483615 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce16c153-56c1-4d91-92db-e52b045186bb-serving-cert\") pod \"service-ca-operator-777779d784-p896d\" (UID: \"ce16c153-56c1-4d91-92db-e52b045186bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.491753 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.499203 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.499321 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:05.99929525 +0000 UTC m=+133.783374142 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.499569 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.500386 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.000371758 +0000 UTC m=+133.784450630 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.511325 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.513450 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce16c153-56c1-4d91-92db-e52b045186bb-config\") pod \"service-ca-operator-777779d784-p896d\" (UID: \"ce16c153-56c1-4d91-92db-e52b045186bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.530649 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.534600 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/943a92a5-ba00-456a-83f4-c383e252288a-config-volume\") pod \"collect-profiles-29411025-88q9j\" (UID: \"943a92a5-ba00-456a-83f4-c383e252288a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.551421 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.571159 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.575119 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/0d511610-c204-4e20-a95e-43a7b41332b8-signing-key\") pod 
\"service-ca-9c57cc56f-ncvcc\" (UID: \"0d511610-c204-4e20-a95e-43a7b41332b8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.579905 4691 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.579966 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e58619-0441-4d0e-9542-b2b8948099ef-certs podName:20e58619-0441-4d0e-9542-b2b8948099ef nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.079950815 +0000 UTC m=+133.864029697 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/20e58619-0441-4d0e-9542-b2b8948099ef-certs") pod "machine-config-server-8wz4g" (UID: "20e58619-0441-4d0e-9542-b2b8948099ef") : failed to sync secret cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.581298 4691 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.581358 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0d511610-c204-4e20-a95e-43a7b41332b8-signing-cabundle podName:0d511610-c204-4e20-a95e-43a7b41332b8 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.081341871 +0000 UTC m=+133.865420743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/0d511610-c204-4e20-a95e-43a7b41332b8-signing-cabundle") pod "service-ca-9c57cc56f-ncvcc" (UID: "0d511610-c204-4e20-a95e-43a7b41332b8") : failed to sync configmap cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.581446 4691 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.581476 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-config-volume podName:366b4073-3bd7-4d0f-ad0e-713f71cd8b5c nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.081467484 +0000 UTC m=+133.865546356 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-config-volume") pod "dns-default-jcpp8" (UID: "366b4073-3bd7-4d0f-ad0e-713f71cd8b5c") : failed to sync configmap cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.581503 4691 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.581527 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-ca podName:dfefdd27-4ee7-4194-990d-0199fbe83a47 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.081519775 +0000 UTC m=+133.865598647 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-ca" (UniqueName: "kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-ca") pod "etcd-operator-b45778765-jlb9l" (UID: "dfefdd27-4ee7-4194-990d-0199fbe83a47") : failed to sync configmap cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.581542 4691 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.581563 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e58619-0441-4d0e-9542-b2b8948099ef-node-bootstrap-token podName:20e58619-0441-4d0e-9542-b2b8948099ef nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.081556816 +0000 UTC m=+133.865635688 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/20e58619-0441-4d0e-9542-b2b8948099ef-node-bootstrap-token") pod "machine-config-server-8wz4g" (UID: "20e58619-0441-4d0e-9542-b2b8948099ef") : failed to sync secret cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.582685 4691 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-client: failed to sync secret cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.582718 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-client podName:dfefdd27-4ee7-4194-990d-0199fbe83a47 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.082710266 +0000 UTC m=+133.866789148 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-client") pod "etcd-operator-b45778765-jlb9l" (UID: "dfefdd27-4ee7-4194-990d-0199fbe83a47") : failed to sync secret cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.582734 4691 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.582756 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/27fa245f-f203-4dfd-9ca1-78bc7c9e17a6-cert podName:27fa245f-f203-4dfd-9ca1-78bc7c9e17a6 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.082749757 +0000 UTC m=+133.866828629 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/27fa245f-f203-4dfd-9ca1-78bc7c9e17a6-cert") pod "ingress-canary-hs9sj" (UID: "27fa245f-f203-4dfd-9ca1-78bc7c9e17a6") : failed to sync secret cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.582821 4691 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.582845 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-service-ca podName:dfefdd27-4ee7-4194-990d-0199fbe83a47 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.082837769 +0000 UTC m=+133.866916651 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-service-ca" (UniqueName: "kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-service-ca") pod "etcd-operator-b45778765-jlb9l" (UID: "dfefdd27-4ee7-4194-990d-0199fbe83a47") : failed to sync configmap cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.583037 4691 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.583183 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-metrics-tls podName:366b4073-3bd7-4d0f-ad0e-713f71cd8b5c nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.083142287 +0000 UTC m=+133.867221139 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-metrics-tls") pod "dns-default-jcpp8" (UID: "366b4073-3bd7-4d0f-ad0e-713f71cd8b5c") : failed to sync secret cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.583048 4691 configmap.go:193] Couldn't get configMap openshift-etcd-operator/etcd-operator-config: failed to sync configmap cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.583257 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-config podName:dfefdd27-4ee7-4194-990d-0199fbe83a47 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.08324874 +0000 UTC m=+133.867327602 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-config") pod "etcd-operator-b45778765-jlb9l" (UID: "dfefdd27-4ee7-4194-990d-0199fbe83a47") : failed to sync configmap cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.583779 4691 secret.go:188] Couldn't get secret openshift-etcd-operator/etcd-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.583853 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfefdd27-4ee7-4194-990d-0199fbe83a47-serving-cert podName:dfefdd27-4ee7-4194-990d-0199fbe83a47 nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.083842265 +0000 UTC m=+133.867921127 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/dfefdd27-4ee7-4194-990d-0199fbe83a47-serving-cert") pod "etcd-operator-b45778765-jlb9l" (UID: "dfefdd27-4ee7-4194-990d-0199fbe83a47") : failed to sync secret cache: timed out waiting for the condition Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.591537 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.600617 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.600947 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.100909712 +0000 UTC m=+133.884988614 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.601046 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.601997 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.101983759 +0000 UTC m=+133.886062631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.611809 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.630439 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.651342 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.672379 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.691406 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.702996 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.703285 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.203239231 +0000 UTC m=+133.987318163 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.704188 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.704654 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.204630947 +0000 UTC m=+133.988709819 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.711109 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.730705 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.751022 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.771226 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.792088 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.805472 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.805682 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.305655053 +0000 UTC m=+134.089733935 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.805840 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.806359 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.306343411 +0000 UTC m=+134.090422283 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.811800 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.832577 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.852162 4691 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.872198 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.907723 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.907907 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.407883631 +0000 UTC m=+134.191962503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.909448 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5plbh\" (UniqueName: \"kubernetes.io/projected/dcf2998e-5d89-4f76-afc0-490573eae9d8-kube-api-access-5plbh\") pod \"cluster-image-registry-operator-dc59b4c8b-ncsfj\" (UID: \"dcf2998e-5d89-4f76-afc0-490573eae9d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.910433 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:05 crc kubenswrapper[4691]: E1202 07:48:05.910996 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.41097854 +0000 UTC m=+134.195057492 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.926813 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf65t\" (UniqueName: \"kubernetes.io/projected/d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6-kube-api-access-zf65t\") pod \"openshift-config-operator-7777fb866f-nl6hp\" (UID: \"d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.931601 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.951608 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 02 07:48:05 crc kubenswrapper[4691]: I1202 07:48:05.971429 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.006683 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjb4x\" (UniqueName: \"kubernetes.io/projected/db7d7a68-80e4-4e9f-a7b4-adcc14282d4d-kube-api-access-zjb4x\") pod \"downloads-7954f5f757-jsqnn\" (UID: \"db7d7a68-80e4-4e9f-a7b4-adcc14282d4d\") " 
pod="openshift-console/downloads-7954f5f757-jsqnn" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.012534 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:06 crc kubenswrapper[4691]: E1202 07:48:06.012923 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.512893399 +0000 UTC m=+134.296972271 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.013288 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:06 crc kubenswrapper[4691]: E1202 07:48:06.013800 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.513740791 +0000 UTC m=+134.297819693 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.026415 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mglkm\" (UniqueName: \"kubernetes.io/projected/4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf-kube-api-access-mglkm\") pod \"dns-operator-744455d44c-tbm97\" (UID: \"4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf\") " pod="openshift-dns-operator/dns-operator-744455d44c-tbm97" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.036672 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.043134 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-tbm97" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.047158 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dcf2998e-5d89-4f76-afc0-490573eae9d8-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-ncsfj\" (UID: \"dcf2998e-5d89-4f76-afc0-490573eae9d8\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.068090 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdhlb\" (UniqueName: \"kubernetes.io/projected/ea5aa03a-f69e-4e94-8586-de42593bce47-kube-api-access-bdhlb\") pod \"route-controller-manager-6576b87f9c-bdlzb\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.085479 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qscq2\" (UniqueName: \"kubernetes.io/projected/7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5-kube-api-access-qscq2\") pod \"machine-approver-56656f9798-j4hpv\" (UID: \"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.110581 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.117377 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.117936 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0d511610-c204-4e20-a95e-43a7b41332b8-signing-cabundle\") pod \"service-ca-9c57cc56f-ncvcc\" (UID: \"0d511610-c204-4e20-a95e-43a7b41332b8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.117982 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/20e58619-0441-4d0e-9542-b2b8948099ef-node-bootstrap-token\") pod \"machine-config-server-8wz4g\" (UID: \"20e58619-0441-4d0e-9542-b2b8948099ef\") " pod="openshift-machine-config-operator/machine-config-server-8wz4g" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.118023 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-ca\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.118200 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-config-volume\") pod \"dns-default-jcpp8\" (UID: \"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c\") " pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:06 
crc kubenswrapper[4691]: I1202 07:48:06.118288 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-service-ca\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.118319 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-client\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.118344 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27fa245f-f203-4dfd-9ca1-78bc7c9e17a6-cert\") pod \"ingress-canary-hs9sj\" (UID: \"27fa245f-f203-4dfd-9ca1-78bc7c9e17a6\") " pod="openshift-ingress-canary/ingress-canary-hs9sj" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.118395 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfefdd27-4ee7-4194-990d-0199fbe83a47-serving-cert\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.118484 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-metrics-tls\") pod \"dns-default-jcpp8\" (UID: \"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c\") " pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.118532 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-config\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.118578 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/20e58619-0441-4d0e-9542-b2b8948099ef-certs\") pod \"machine-config-server-8wz4g\" (UID: \"20e58619-0441-4d0e-9542-b2b8948099ef\") " pod="openshift-machine-config-operator/machine-config-server-8wz4g" Dec 02 07:48:06 crc kubenswrapper[4691]: E1202 07:48:06.118779 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.618730109 +0000 UTC m=+134.402808971 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.119233 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-ca\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.119417 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-service-ca\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.120058 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfefdd27-4ee7-4194-990d-0199fbe83a47-config\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.120070 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/0d511610-c204-4e20-a95e-43a7b41332b8-signing-cabundle\") pod \"service-ca-9c57cc56f-ncvcc\" (UID: \"0d511610-c204-4e20-a95e-43a7b41332b8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.121833 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/20e58619-0441-4d0e-9542-b2b8948099ef-node-bootstrap-token\") pod \"machine-config-server-8wz4g\" (UID: \"20e58619-0441-4d0e-9542-b2b8948099ef\") " pod="openshift-machine-config-operator/machine-config-server-8wz4g" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.121879 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/20e58619-0441-4d0e-9542-b2b8948099ef-certs\") pod \"machine-config-server-8wz4g\" (UID: \"20e58619-0441-4d0e-9542-b2b8948099ef\") " pod="openshift-machine-config-operator/machine-config-server-8wz4g" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.122010 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/dfefdd27-4ee7-4194-990d-0199fbe83a47-etcd-client\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.122276 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfefdd27-4ee7-4194-990d-0199fbe83a47-serving-cert\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.132592 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.145161 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-metrics-tls\") pod \"dns-default-jcpp8\" (UID: \"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c\") " pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.151127 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.159676 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-config-volume\") pod \"dns-default-jcpp8\" (UID: \"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c\") " pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.170965 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.191508 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.193217 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.197586 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp"] Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.205729 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/27fa245f-f203-4dfd-9ca1-78bc7c9e17a6-cert\") pod \"ingress-canary-hs9sj\" (UID: \"27fa245f-f203-4dfd-9ca1-78bc7c9e17a6\") " pod="openshift-ingress-canary/ingress-canary-hs9sj" Dec 02 07:48:06 crc kubenswrapper[4691]: W1202 07:48:06.207449 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9bf02c0_4f72_4d0c_8355_b0aa05b02ec6.slice/crio-f067e7ad0b3a8eb77ef75cbb9fdc56f1f5dbbedd68dfdbb7f1e1c638e1146811 WatchSource:0}: Error finding container f067e7ad0b3a8eb77ef75cbb9fdc56f1f5dbbedd68dfdbb7f1e1c638e1146811: Status 404 returned error can't find the container with id f067e7ad0b3a8eb77ef75cbb9fdc56f1f5dbbedd68dfdbb7f1e1c638e1146811 Dec 02 07:48:06 crc kubenswrapper[4691]: W1202 07:48:06.208467 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f46cd7e_1b0b_4ad2_bdce_abee324bcdd5.slice/crio-1b89d47de60ad6ca27a9f07f46b9320970cf7b4f794a7d55bc4e85803fe5638d WatchSource:0}: Error finding container 1b89d47de60ad6ca27a9f07f46b9320970cf7b4f794a7d55bc4e85803fe5638d: Status 404 returned error can't find the container with id 1b89d47de60ad6ca27a9f07f46b9320970cf7b4f794a7d55bc4e85803fe5638d Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.210818 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 
07:48:06.221559 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-tbm97"]
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.222222 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:06 crc kubenswrapper[4691]: E1202 07:48:06.222752 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.722740961 +0000 UTC m=+134.506819823 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.223117 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-jsqnn"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.229945 4691 request.go:700] Waited for 1.905905594s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.231028 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Dec 02 07:48:06 crc kubenswrapper[4691]: W1202 07:48:06.237391 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b28cfe7_f3d6_4ff9_b1ae_371fdf083acf.slice/crio-264cbf5efc7cba8138f34d1d56c479e0328a7920e5aa2b9cdcb989a023c3245e WatchSource:0}: Error finding container 264cbf5efc7cba8138f34d1d56c479e0328a7920e5aa2b9cdcb989a023c3245e: Status 404 returned error can't find the container with id 264cbf5efc7cba8138f34d1d56c479e0328a7920e5aa2b9cdcb989a023c3245e
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.265748 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bks2b\" (UniqueName: \"kubernetes.io/projected/ad8eb845-25ea-4917-80a0-46e84ed6ef97-kube-api-access-bks2b\") pod \"authentication-operator-69f744f599-jkx5c\" (UID: \"ad8eb845-25ea-4917-80a0-46e84ed6ef97\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.282743 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.285868 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfzrp\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-kube-api-access-pfzrp\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.299923 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.323805 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:06 crc kubenswrapper[4691]: E1202 07:48:06.324356 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.824339603 +0000 UTC m=+134.608418465 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.325158 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqlv2\" (UniqueName: \"kubernetes.io/projected/677bf100-9036-4b58-9658-6b918304ba47-kube-api-access-fqlv2\") pod \"machine-api-operator-5694c8668f-qvdbg\" (UID: \"677bf100-9036-4b58-9658-6b918304ba47\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.329897 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-bound-sa-token\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.330276 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.347590 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5qq7\" (UniqueName: \"kubernetes.io/projected/42a8f203-7e80-4e55-bd79-afe843279906-kube-api-access-c5qq7\") pod \"cluster-samples-operator-665b6dd947-8kj4z\" (UID: \"42a8f203-7e80-4e55-bd79-afe843279906\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.350189 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.366411 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpdrb\" (UniqueName: \"kubernetes.io/projected/c4d13548-f8dc-4a70-a287-9bee33dd7dd4-kube-api-access-vpdrb\") pod \"apiserver-76f77b778f-ng6l4\" (UID: \"c4d13548-f8dc-4a70-a287-9bee33dd7dd4\") " pod="openshift-apiserver/apiserver-76f77b778f-ng6l4"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.388187 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twxxt\" (UniqueName: \"kubernetes.io/projected/5c7ffef5-c616-4596-aab4-6daa6eed6d46-kube-api-access-twxxt\") pod \"console-operator-58897d9998-tcd8d\" (UID: \"5c7ffef5-c616-4596-aab4-6daa6eed6d46\") " pod="openshift-console-operator/console-operator-58897d9998-tcd8d"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.406389 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd9dk\" (UniqueName: \"kubernetes.io/projected/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-kube-api-access-kd9dk\") pod \"oauth-openshift-558db77b4-mcdgg\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.429392 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.429478 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drtn4\" (UniqueName: \"kubernetes.io/projected/4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c-kube-api-access-drtn4\") pod \"apiserver-7bbb656c7d-nqndh\" (UID: \"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh"
Dec 02 07:48:06 crc kubenswrapper[4691]: E1202 07:48:06.429895 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:06.929872974 +0000 UTC m=+134.713951836 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.441856 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ng6l4"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.451646 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jsqnn"]
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.454910 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkm8d\" (UniqueName: \"kubernetes.io/projected/eb7b09b1-f092-4e2b-8b07-e03343753503-kube-api-access-wkm8d\") pod \"controller-manager-879f6c89f-87dnb\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.463209 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.474334 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2bxr\" (UniqueName: \"kubernetes.io/projected/011633b2-37cb-46bd-b120-a9a9023d40fb-kube-api-access-m2bxr\") pod \"openshift-apiserver-operator-796bbdcf4f-blfck\" (UID: \"011633b2-37cb-46bd-b120-a9a9023d40fb\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.477065 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.489826 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bgj9\" (UniqueName: \"kubernetes.io/projected/1e8a5b37-f843-459c-93ca-379044dbbec4-kube-api-access-7bgj9\") pod \"openshift-controller-manager-operator-756b6f6bc6-7ff8p\" (UID: \"1e8a5b37-f843-459c-93ca-379044dbbec4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.496162 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.503830 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.511149 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.511539 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.523314 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-jkx5c"]
Dec 02 07:48:06 crc kubenswrapper[4691]: W1202 07:48:06.530718 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb7d7a68_80e4_4e9f_a7b4_adcc14282d4d.slice/crio-421d401a05def654fffaeb28504c280b617a973eac83f0edf3be28affc5440ed WatchSource:0}: Error finding container 421d401a05def654fffaeb28504c280b617a973eac83f0edf3be28affc5440ed: Status 404 returned error can't find the container with id 421d401a05def654fffaeb28504c280b617a973eac83f0edf3be28affc5440ed
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.530898 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.531132 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:06 crc kubenswrapper[4691]: E1202 07:48:06.531535 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.031515126 +0000 UTC m=+134.815593998 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.549592 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.552267 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.572092 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.575446 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb"]
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.591502 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Dec 02 07:48:06 crc kubenswrapper[4691]: W1202 07:48:06.621166 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea5aa03a_f69e_4e94_8586_de42593bce47.slice/crio-18196a9d7ff9a3781e07d6f0821292bc591f19190871b778ca23e156d4455ad1 WatchSource:0}: Error finding container 18196a9d7ff9a3781e07d6f0821292bc591f19190871b778ca23e156d4455ad1: Status 404 returned error can't find the container with id 18196a9d7ff9a3781e07d6f0821292bc591f19190871b778ca23e156d4455ad1
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.624266 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tcd8d"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.627565 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x7pm\" (UniqueName: \"kubernetes.io/projected/ce16c153-56c1-4d91-92db-e52b045186bb-kube-api-access-4x7pm\") pod \"service-ca-operator-777779d784-p896d\" (UID: \"ce16c153-56c1-4d91-92db-e52b045186bb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.633564 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:06 crc kubenswrapper[4691]: E1202 07:48:06.634308 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.134291578 +0000 UTC m=+134.918370440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.650424 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sx8m\" (UniqueName: \"kubernetes.io/projected/1eb687f4-51f6-4806-b5a0-e35639b4b019-kube-api-access-6sx8m\") pod \"csi-hostpathplugin-2pv42\" (UID: \"1eb687f4-51f6-4806-b5a0-e35639b4b019\") " pod="hostpath-provisioner/csi-hostpathplugin-2pv42"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.653446 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj"]
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.662415 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.675744 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nlk6\" (UniqueName: \"kubernetes.io/projected/27fa245f-f203-4dfd-9ca1-78bc7c9e17a6-kube-api-access-4nlk6\") pod \"ingress-canary-hs9sj\" (UID: \"27fa245f-f203-4dfd-9ca1-78bc7c9e17a6\") " pod="openshift-ingress-canary/ingress-canary-hs9sj"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.702590 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6rbn\" (UniqueName: \"kubernetes.io/projected/5d2fe571-0e98-42f1-8f71-5cdc773ec89e-kube-api-access-w6rbn\") pod \"multus-admission-controller-857f4d67dd-lb7g9\" (UID: \"5d2fe571-0e98-42f1-8f71-5cdc773ec89e\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-lb7g9"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.705249 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkwgt\" (UniqueName: \"kubernetes.io/projected/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-kube-api-access-rkwgt\") pod \"console-f9d7485db-wx6m2\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " pod="openshift-console/console-f9d7485db-wx6m2"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.713030 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-lb7g9"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.726513 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rksj\" (UniqueName: \"kubernetes.io/projected/46a270d2-43de-41d0-bb1b-dc02b1a28d3a-kube-api-access-8rksj\") pod \"router-default-5444994796-h7dkw\" (UID: \"46a270d2-43de-41d0-bb1b-dc02b1a28d3a\") " pod="openshift-ingress/router-default-5444994796-h7dkw"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.736452 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:06 crc kubenswrapper[4691]: E1202 07:48:06.737026 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.237006837 +0000 UTC m=+135.021085699 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.751240 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh2b4\" (UniqueName: \"kubernetes.io/projected/54581392-8f19-498b-b24d-c35064382946-kube-api-access-hh2b4\") pod \"machine-config-operator-74547568cd-7pvc7\" (UID: \"54581392-8f19-498b-b24d-c35064382946\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.758609 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-h7dkw"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.764737 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql5t7\" (UniqueName: \"kubernetes.io/projected/c8c8387e-bdc5-4d7a-ae05-776786ee7277-kube-api-access-ql5t7\") pod \"ingress-operator-5b745b69d9-b28bc\" (UID: \"c8c8387e-bdc5-4d7a-ae05-776786ee7277\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.787705 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ec8623c-5335-43a5-8431-49e66d87b4f5-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8mw2g\" (UID: \"4ec8623c-5335-43a5-8431-49e66d87b4f5\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.807139 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wx6m2"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.813784 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcvzn\" (UniqueName: \"kubernetes.io/projected/39b63660-5845-41eb-94ca-ffc8ccb34413-kube-api-access-xcvzn\") pod \"control-plane-machine-set-operator-78cbb6b69f-8x9pv\" (UID: \"39b63660-5845-41eb-94ca-ffc8ccb34413\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.824778 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls22m\" (UniqueName: \"kubernetes.io/projected/dfefdd27-4ee7-4194-990d-0199fbe83a47-kube-api-access-ls22m\") pod \"etcd-operator-b45778765-jlb9l\" (UID: \"dfefdd27-4ee7-4194-990d-0199fbe83a47\") " pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.828214 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.841492 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:06 crc kubenswrapper[4691]: E1202 07:48:06.841924 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.341887142 +0000 UTC m=+135.125965994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.850525 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.853358 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2b6t\" (UniqueName: \"kubernetes.io/projected/20e58619-0441-4d0e-9542-b2b8948099ef-kube-api-access-z2b6t\") pod \"machine-config-server-8wz4g\" (UID: \"20e58619-0441-4d0e-9542-b2b8948099ef\") " pod="openshift-machine-config-operator/machine-config-server-8wz4g"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.868465 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lkjj\" (UniqueName: \"kubernetes.io/projected/33710686-f504-4a80-a2d4-e42b0bfb3033-kube-api-access-4lkjj\") pod \"package-server-manager-789f6589d5-5qs4n\" (UID: \"33710686-f504-4a80-a2d4-e42b0bfb3033\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.874514 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-2pv42"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.892860 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw5cj\" (UniqueName: \"kubernetes.io/projected/483d25f5-c04d-4c89-a260-59cc01d12255-kube-api-access-qw5cj\") pod \"packageserver-d55dfcdfc-tl75k\" (UID: \"483d25f5-c04d-4c89-a260-59cc01d12255\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.908403 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z"]
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.920517 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8wz4g"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.928633 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4244w\" (UniqueName: \"kubernetes.io/projected/688a963d-2808-4961-a584-1ee4a3ada61d-kube-api-access-4244w\") pod \"marketplace-operator-79b997595-sv92r\" (UID: \"688a963d-2808-4961-a584-1ee4a3ada61d\") " pod="openshift-marketplace/marketplace-operator-79b997595-sv92r"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.934893 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hs9sj"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.934988 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qvdbg"]
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.935399 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zscd5\" (UniqueName: \"kubernetes.io/projected/366b4073-3bd7-4d0f-ad0e-713f71cd8b5c-kube-api-access-zscd5\") pod \"dns-default-jcpp8\" (UID: \"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c\") " pod="openshift-dns/dns-default-jcpp8"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.943570 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:06 crc kubenswrapper[4691]: E1202 07:48:06.943853 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.443823262 +0000 UTC m=+135.227902124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.943921 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:06 crc kubenswrapper[4691]: E1202 07:48:06.944482 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.444472099 +0000 UTC m=+135.228550961 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.964248 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49bqr\" (UniqueName: \"kubernetes.io/projected/3cf0e12e-115d-4ac5-b646-6ac98524d948-kube-api-access-49bqr\") pod \"migrator-59844c95c7-4ms2c\" (UID: \"3cf0e12e-115d-4ac5-b646-6ac98524d948\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ms2c"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.966438 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh"]
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.980964 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh9k4\" (UniqueName: \"kubernetes.io/projected/1b9abc34-1981-4f2b-b827-5347ed0f3d7e-kube-api-access-lh9k4\") pod \"olm-operator-6b444d44fb-x96cz\" (UID: \"1b9abc34-1981-4f2b-b827-5347ed0f3d7e\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz"
Dec 02 07:48:06 crc kubenswrapper[4691]: I1202 07:48:06.992837 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a0f413a7-cc30-430c-a9a9-b7eb6da2916d-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-6kpnf\" (UID: \"a0f413a7-cc30-430c-a9a9-b7eb6da2916d\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf"
Dec 02 07:48:07 crc kubenswrapper[4691]: W1202 07:48:07.003916 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod677bf100_9036_4b58_9658_6b918304ba47.slice/crio-6a3aa4645e75267cb5dc37fc80f715091028b972a45e00584d11fc0fc93b8edd WatchSource:0}: Error finding container 6a3aa4645e75267cb5dc37fc80f715091028b972a45e00584d11fc0fc93b8edd: Status 404 returned error can't find the container with id 6a3aa4645e75267cb5dc37fc80f715091028b972a45e00584d11fc0fc93b8edd
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.005891 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-87dnb"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.019969 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.026889 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ng6l4"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.044834 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.045566 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:07 crc kubenswrapper[4691]: E1202 07:48:07.045792 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.545747751 +0000 UTC m=+135.329826623 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.046041 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:07 crc kubenswrapper[4691]: E1202 07:48:07.046375 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.546365337 +0000 UTC m=+135.330444199 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.048886 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8c8387e-bdc5-4d7a-ae05-776786ee7277-bound-sa-token\") pod \"ingress-operator-5b745b69d9-b28bc\" (UID: \"c8c8387e-bdc5-4d7a-ae05-776786ee7277\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.051364 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnjdw\" (UniqueName: \"kubernetes.io/projected/01d8467f-e617-4b68-ada8-440891bb4b51-kube-api-access-wnjdw\") pod \"catalog-operator-68c6474976-vgwkd\" (UID: \"01d8467f-e617-4b68-ada8-440891bb4b51\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.051468 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.052139 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86sks\" (UniqueName: \"kubernetes.io/projected/943a92a5-ba00-456a-83f4-c383e252288a-kube-api-access-86sks\") pod \"collect-profiles-29411025-88q9j\" (UID: \"943a92a5-ba00-456a-83f4-c383e252288a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.064370 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.074708 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.083293 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.084450 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpg4q\" (UniqueName: \"kubernetes.io/projected/2c561340-bdf6-4d50-85c4-10a9098e12b7-kube-api-access-bpg4q\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hffd\" (UID: \"2c561340-bdf6-4d50-85c4-10a9098e12b7\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.097182 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.100722 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.121558 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.122203 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ms2c"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.124133 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f24830de-18e3-4204-8996-e7f1b0d45aec-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hd92g\" (UID: \"f24830de-18e3-4204-8996-e7f1b0d45aec\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.135324 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.146874 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:07 crc kubenswrapper[4691]: E1202 07:48:07.147036 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.647020514 +0000 UTC m=+135.431099376 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.147263 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:07 crc kubenswrapper[4691]: E1202 07:48:07.147665 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.64765357 +0000 UTC m=+135.431732432 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.158578 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t58tg\" (UniqueName: \"kubernetes.io/projected/0d511610-c204-4e20-a95e-43a7b41332b8-kube-api-access-t58tg\") pod \"service-ca-9c57cc56f-ncvcc\" (UID: \"0d511610-c204-4e20-a95e-43a7b41332b8\") " pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.159865 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzsqc\" (UniqueName: \"kubernetes.io/projected/349309ae-e421-4055-96a2-0480c5562853-kube-api-access-lzsqc\") pod \"machine-config-controller-84d6567774-nkzgs\" (UID: \"349309ae-e421-4055-96a2-0480c5562853\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.178656 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.184805 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.218549 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tcd8d"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.227812 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jcpp8"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.237992 4691 generic.go:334] "Generic (PLEG): container finished" podID="d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6" containerID="89d71d24a287fbe19ff3b98e31f214a49908480a67a8d65eae6f1369a09efdcb" exitCode=0
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.238749 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" event={"ID":"d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6","Type":"ContainerDied","Data":"89d71d24a287fbe19ff3b98e31f214a49908480a67a8d65eae6f1369a09efdcb"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.238802 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" event={"ID":"d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6","Type":"ContainerStarted","Data":"f067e7ad0b3a8eb77ef75cbb9fdc56f1f5dbbedd68dfdbb7f1e1c638e1146811"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.249024 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:07 crc kubenswrapper[4691]: E1202 07:48:07.249459 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.749444056 +0000 UTC m=+135.533522918 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.253708 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mcdgg"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.256220 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" event={"ID":"ea5aa03a-f69e-4e94-8586-de42593bce47","Type":"ContainerStarted","Data":"c0f5ab488dad468beb72deb8f9314317982cf2cc0e9367198c1c1b95bafc427a"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.256300 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" event={"ID":"ea5aa03a-f69e-4e94-8586-de42593bce47","Type":"ContainerStarted","Data":"18196a9d7ff9a3781e07d6f0821292bc591f19190871b778ca23e156d4455ad1"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.256803 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.257968 4691 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-bdlzb container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.258014 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" podUID="ea5aa03a-f69e-4e94-8586-de42593bce47" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.258417 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" event={"ID":"eb7b09b1-f092-4e2b-8b07-e03343753503","Type":"ContainerStarted","Data":"5ff4b82205f9feca9c0e16717be9da438f0f134d443e7583fa199adf9ad23ea9"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.259365 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-h7dkw" event={"ID":"46a270d2-43de-41d0-bb1b-dc02b1a28d3a","Type":"ContainerStarted","Data":"ac078a4397dcecda5a672cd97a09dfa6929fe07ee1e99ef64aaa9992324e0e8e"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.273307 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" event={"ID":"dcf2998e-5d89-4f76-afc0-490573eae9d8","Type":"ContainerStarted","Data":"c8fa6474ed19c3087161877aa46344d34365bc4350b7701f658a5c0a2aa7e47f"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.273335 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" event={"ID":"dcf2998e-5d89-4f76-afc0-490573eae9d8","Type":"ContainerStarted","Data":"0256138a7e4b904d04dd3471a0cfb00d0b184ab0f3af0f5fd80b208b79fa5ea6"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.293629 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" event={"ID":"677bf100-9036-4b58-9658-6b918304ba47","Type":"ContainerStarted","Data":"6a3aa4645e75267cb5dc37fc80f715091028b972a45e00584d11fc0fc93b8edd"}
Dec 02 07:48:07 crc kubenswrapper[4691]: W1202 07:48:07.326362 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e8a5b37_f843_459c_93ca_379044dbbec4.slice/crio-997772876b1618ab192b184ee13367ecf7356c5d2a4da105e91b550e88e8c3ca WatchSource:0}: Error finding container 997772876b1618ab192b184ee13367ecf7356c5d2a4da105e91b550e88e8c3ca: Status 404 returned error can't find the container with id 997772876b1618ab192b184ee13367ecf7356c5d2a4da105e91b550e88e8c3ca
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.326960 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" event={"ID":"ad8eb845-25ea-4917-80a0-46e84ed6ef97","Type":"ContainerStarted","Data":"1ef64405fcafe181275b0ae8b55578cf7fb1cd1a4e74c0fbd1ea85c8cd93f881"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.327003 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" event={"ID":"ad8eb845-25ea-4917-80a0-46e84ed6ef97","Type":"ContainerStarted","Data":"97394b7f62e8ad1bf0f9eaf4a647e24f504886053050ecfe511ec1b3ed046fb6"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.327075 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.331259 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.346038 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.350646 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:07 crc kubenswrapper[4691]: E1202 07:48:07.351059 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.851047527 +0000 UTC m=+135.635126379 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.375286 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" event={"ID":"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5","Type":"ContainerStarted","Data":"14288ce09c9ddb1762b931f9433b292452c70293b959debbece8710332d2b83d"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.375331 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" event={"ID":"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5","Type":"ContainerStarted","Data":"13512e49cc3adb21fb69720378ca1487399dd915bad96a5d80998330f39e47fd"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.375340 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" event={"ID":"7f46cd7e-1b0b-4ad2-bdce-abee324bcdd5","Type":"ContainerStarted","Data":"1b89d47de60ad6ca27a9f07f46b9320970cf7b4f794a7d55bc4e85803fe5638d"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.389032 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.390373 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" event={"ID":"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c","Type":"ContainerStarted","Data":"1e0c06c1c77a6c519b4b8e71e0197932a2f6e53b8ef4c6daccd5a654152164f7"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.443024 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.458450 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:07 crc kubenswrapper[4691]: E1202 07:48:07.459704 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:07.959688168 +0000 UTC m=+135.743767030 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.466689 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tbm97" event={"ID":"4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf","Type":"ContainerStarted","Data":"00de01f6c69379bee9bc1e79fffcd55a0358afc7e1e1641eb2f5ade797bde058"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.466742 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tbm97" event={"ID":"4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf","Type":"ContainerStarted","Data":"7fc0c4be6ff34a1108add1b0dd356afc5493820a330a5ac407830c0a40c91f96"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.466772 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-tbm97" event={"ID":"4b28cfe7-f3d6-4ff9-b1ae-371fdf083acf","Type":"ContainerStarted","Data":"264cbf5efc7cba8138f34d1d56c479e0328a7920e5aa2b9cdcb989a023c3245e"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.494213 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jsqnn" event={"ID":"db7d7a68-80e4-4e9f-a7b4-adcc14282d4d","Type":"ContainerStarted","Data":"d8c9c498299847c3e7917fded008dd54b4c97c469e0aba24fed76fe9d24928b5"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.494259 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jsqnn" event={"ID":"db7d7a68-80e4-4e9f-a7b4-adcc14282d4d","Type":"ContainerStarted","Data":"421d401a05def654fffaeb28504c280b617a973eac83f0edf3be28affc5440ed"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.495207 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-jsqnn"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.500354 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-jlb9l"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.503043 4691 patch_prober.go:28] interesting pod/downloads-7954f5f757-jsqnn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.503348 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jsqnn" podUID="db7d7a68-80e4-4e9f-a7b4-adcc14282d4d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.510006 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" event={"ID":"c4d13548-f8dc-4a70-a287-9bee33dd7dd4","Type":"ContainerStarted","Data":"6a1d85f7fbe88281d5d9b951c7996b2987fef415fd132bf661471ab3e61f15fb"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.518634 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z" event={"ID":"42a8f203-7e80-4e55-bd79-afe843279906","Type":"ContainerStarted","Data":"59000b946569e672c97620cb465997aa21eec428d24b49ce3b93bebf6c8e6ac7"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.522896 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8wz4g" event={"ID":"20e58619-0441-4d0e-9542-b2b8948099ef","Type":"ContainerStarted","Data":"e4b7ef2a92d63f04a3fd0a390a51b236695315fed600f176f5b3b663c0fb7abe"}
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.541725 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-wx6m2"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.546466 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-lb7g9"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.553909 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-p896d"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.560029 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:07 crc kubenswrapper[4691]: E1202 07:48:07.561271 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:08.061259339 +0000 UTC m=+135.845338201 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.661856 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:07 crc kubenswrapper[4691]: E1202 07:48:07.662981 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:08.162965603 +0000 UTC m=+135.947044455 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.778139 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:07 crc kubenswrapper[4691]: E1202 07:48:07.778925 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:08.278911641 +0000 UTC m=+136.062990503 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.801270 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hs9sj"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.828583 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.832605 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-2pv42"]
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.850897 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-jsqnn" podStartSLOduration=117.850879453 podStartE2EDuration="1m57.850879453s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:07.845824334 +0000 UTC m=+135.629903196" watchObservedRunningTime="2025-12-02 07:48:07.850879453 +0000 UTC m=+135.634958315"
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.881302 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:07 crc kubenswrapper[4691]: E1202 07:48:07.881620 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:08.38160499 +0000 UTC m=+136.165683852 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:07 crc kubenswrapper[4691]: I1202 07:48:07.984682 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:07 crc kubenswrapper[4691]: E1202 07:48:07.985420 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:08.485383806 +0000 UTC m=+136.269462678 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.088620 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:08 crc kubenswrapper[4691]: E1202 07:48:08.088940 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:08.588926037 +0000 UTC m=+136.373004899 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.113108 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" podStartSLOduration=118.113085465 podStartE2EDuration="1m58.113085465s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:08.05546517 +0000 UTC m=+135.839544042" watchObservedRunningTime="2025-12-02 07:48:08.113085465 +0000 UTC m=+135.897164327" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.113345 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-j4hpv" podStartSLOduration=118.113339352 podStartE2EDuration="1m58.113339352s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:08.088474395 +0000 UTC m=+135.872553257" watchObservedRunningTime="2025-12-02 07:48:08.113339352 +0000 UTC m=+135.897418214" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.190923 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:08 crc kubenswrapper[4691]: E1202 07:48:08.191691 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:08.691675497 +0000 UTC m=+136.475754359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:08 crc kubenswrapper[4691]: W1202 07:48:08.195506 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ec8623c_5335_43a5_8431_49e66d87b4f5.slice/crio-2b2eeab98c07552e320778413fcf4939a35a8d325013f26415611de2badb9246 WatchSource:0}: Error finding container 2b2eeab98c07552e320778413fcf4939a35a8d325013f26415611de2badb9246: Status 404 returned error can't find the container with id 2b2eeab98c07552e320778413fcf4939a35a8d325013f26415611de2badb9246 Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.293890 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:08 crc kubenswrapper[4691]: E1202 07:48:08.294289 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:08.794273334 +0000 UTC m=+136.578352196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.395387 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:08 crc kubenswrapper[4691]: E1202 07:48:08.396024 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:08.896012989 +0000 UTC m=+136.680091841 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.490814 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-ncsfj" podStartSLOduration=118.490792895 podStartE2EDuration="1m58.490792895s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:08.465731093 +0000 UTC m=+136.249809955" watchObservedRunningTime="2025-12-02 07:48:08.490792895 +0000 UTC m=+136.274871757" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.505434 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:08 crc kubenswrapper[4691]: E1202 07:48:08.505684 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:09.005663666 +0000 UTC m=+136.789742528 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.505820 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:08 crc kubenswrapper[4691]: E1202 07:48:08.506091 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:09.006077756 +0000 UTC m=+136.790156618 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.538724 4691 generic.go:334] "Generic (PLEG): container finished" podID="4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c" containerID="e5799ce00f3cf9b6363e3a704db5ceeb31e2df5224fdea73cef3f4dc7c319300" exitCode=0 Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.538824 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" event={"ID":"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c","Type":"ContainerDied","Data":"e5799ce00f3cf9b6363e3a704db5ceeb31e2df5224fdea73cef3f4dc7c319300"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.543029 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" event={"ID":"eb7b09b1-f092-4e2b-8b07-e03343753503","Type":"ContainerStarted","Data":"8dd7dbb6b386629f42362837e1ccd1f13f2f2de51b389168dc847c9d4ba16374"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.545867 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.546561 4691 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-87dnb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.546598 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" podUID="eb7b09b1-f092-4e2b-8b07-e03343753503" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.556536 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz"] Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.622659 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:08 crc kubenswrapper[4691]: E1202 07:48:08.623999 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:09.123978955 +0000 UTC m=+136.908057817 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.682962 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8wz4g" event={"ID":"20e58619-0441-4d0e-9542-b2b8948099ef","Type":"ContainerStarted","Data":"9b6df5885afe15d35d9ef5b646b2d790ca5f7241694032aa209757c24cc749e4"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.683034 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n"] Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.683057 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7"] Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.683124 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd"] Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.683163 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p" event={"ID":"1e8a5b37-f843-459c-93ca-379044dbbec4","Type":"ContainerStarted","Data":"a3e4a8ef08e50f350877f93f74c398f42b22bbfc8f97b0d21ac2b10bd1a628d1"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.683219 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p" event={"ID":"1e8a5b37-f843-459c-93ca-379044dbbec4","Type":"ContainerStarted","Data":"997772876b1618ab192b184ee13367ecf7356c5d2a4da105e91b550e88e8c3ca"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.683244 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2pv42" event={"ID":"1eb687f4-51f6-4806-b5a0-e35639b4b019","Type":"ContainerStarted","Data":"0c385bdc5abaaacdc47f65386d9b747ddf909f1e55e40fd37d4e9e76f8db6253"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.683256 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g" event={"ID":"4ec8623c-5335-43a5-8431-49e66d87b4f5","Type":"ContainerStarted","Data":"2b2eeab98c07552e320778413fcf4939a35a8d325013f26415611de2badb9246"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.684472 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d" event={"ID":"ce16c153-56c1-4d91-92db-e52b045186bb","Type":"ContainerStarted","Data":"d6baeb86b5f9beb2ab1e15ea16004ed635c55ab74a81b7827fb4c77724351ca0"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.692166 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-h7dkw" event={"ID":"46a270d2-43de-41d0-bb1b-dc02b1a28d3a","Type":"ContainerStarted","Data":"c32d68a5647724c89dd42cf99c1db059684b09900f8c408c880b09ee297b6361"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.693328 4691 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" event={"ID":"dfefdd27-4ee7-4194-990d-0199fbe83a47","Type":"ContainerStarted","Data":"4e1b29298c4dcada3b5cdf9ecf84bd93211e7bc69f110d46e6bc4745d1073922"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.718580 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lb7g9" event={"ID":"5d2fe571-0e98-42f1-8f71-5cdc773ec89e","Type":"ContainerStarted","Data":"25aec108c788b2b9e7ab5075b5e31d2a3229cc43ae44d360d36577357bd4b05e"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.730353 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:08 crc kubenswrapper[4691]: E1202 07:48:08.731629 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:09.23159293 +0000 UTC m=+137.015671792 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.771539 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.780936 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z" event={"ID":"42a8f203-7e80-4e55-bd79-afe843279906","Type":"ContainerStarted","Data":"8d8006d4f8ebb8ed4682a23719cecf3b9bd0d8ed028821b2f1143d3d199f45c8"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.791128 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:48:08 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Dec 02 07:48:08 crc kubenswrapper[4691]: [+]process-running ok Dec 02 07:48:08 crc kubenswrapper[4691]: healthz check failed Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.791194 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.815419 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tcd8d" 
event={"ID":"5c7ffef5-c616-4596-aab4-6daa6eed6d46","Type":"ContainerStarted","Data":"d14ad50efcdc72db005fec410c960b64c82e76fca9f4a0a8514e94b67d28b709"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.815507 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tcd8d" event={"ID":"5c7ffef5-c616-4596-aab4-6daa6eed6d46","Type":"ContainerStarted","Data":"bf3e75a8b3bc19f93567c243d5232ba3c7a5776d48e0ad0d206bf6e051da149f"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.817893 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.820947 4691 patch_prober.go:28] interesting pod/console-operator-58897d9998-tcd8d container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.821016 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tcd8d" podUID="5c7ffef5-c616-4596-aab4-6daa6eed6d46" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.822116 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hs9sj" event={"ID":"27fa245f-f203-4dfd-9ca1-78bc7c9e17a6","Type":"ContainerStarted","Data":"f63554dcf23a7d9eb8cb543bce71f800dc669f0e07abc963a5e4d9c8076ac602"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.839539 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:08 crc kubenswrapper[4691]: E1202 07:48:08.843689 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:09.343673149 +0000 UTC m=+137.127752011 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.851624 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" event={"ID":"677bf100-9036-4b58-9658-6b918304ba47","Type":"ContainerStarted","Data":"5f0c796d6751493c77a07cac376fd39ba391f0f38546391322b92a816ee6bb51"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.851856 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" event={"ID":"677bf100-9036-4b58-9658-6b918304ba47","Type":"ContainerStarted","Data":"25cbbe6f0417319ee346f14b6c6afa9d3e71cecb8a82da68e864abf479ae10a5"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.904201 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck" event={"ID":"011633b2-37cb-46bd-b120-a9a9023d40fb","Type":"ContainerStarted","Data":"3686a9fa8babac5f2abfe0b3eae65c98114585d9777b4c438c05e9aca778b652"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.904506 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck" event={"ID":"011633b2-37cb-46bd-b120-a9a9023d40fb","Type":"ContainerStarted","Data":"3e8e213c176f5d398229bee7c1a0d8a0e5a567c0abaf99ac1602692cb46c1457"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.908290 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" event={"ID":"c4d13548-f8dc-4a70-a287-9bee33dd7dd4","Type":"ContainerStarted","Data":"d8b4d076c9743aaef7ee26203929b7f80a8ef69d771b0e932b6ae6c70c773fa3"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.916451 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" event={"ID":"d9bf02c0-4f72-4d0c-8355-b0aa05b02ec6","Type":"ContainerStarted","Data":"b0b4b0bf6a3d6f3bc99acf217915458a7e65b9582abadf73faa600189d6cf362"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.916487 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.925215 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" event={"ID":"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9","Type":"ContainerStarted","Data":"bc08c90ea1ca6d240504ac103cb6d917c914414f2b8dcaf6d17426d6a73554f1"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.929020 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wx6m2" event={"ID":"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4","Type":"ContainerStarted","Data":"ee21421ae4fb6db73c078384220c18aad2b38ce487892fc3d4db60dc0f8bb1c1"} Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.931701 4691 patch_prober.go:28] interesting pod/downloads-7954f5f757-jsqnn container/download-server namespace/openshift-console: Readiness probe 
status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.931730 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jsqnn" podUID="db7d7a68-80e4-4e9f-a7b4-adcc14282d4d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.934646 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-jkx5c" podStartSLOduration=118.934623018 podStartE2EDuration="1m58.934623018s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:08.932550765 +0000 UTC m=+136.716629627" watchObservedRunningTime="2025-12-02 07:48:08.934623018 +0000 UTC m=+136.718701880" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.944289 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.944939 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:08 crc kubenswrapper[4691]: E1202 07:48:08.947221 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:09.44720977 +0000 UTC m=+137.231288622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:08 crc kubenswrapper[4691]: I1202 07:48:08.987209 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-tbm97" podStartSLOduration=118.987176803 podStartE2EDuration="1m58.987176803s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:08.976307655 +0000 UTC m=+136.760386517" watchObservedRunningTime="2025-12-02 07:48:08.987176803 +0000 UTC m=+136.771255665" Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.025739 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-4ms2c"] Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.047567 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:09 crc kubenswrapper[4691]: E1202 07:48:09.048740 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:09.548721489 +0000 UTC m=+137.332800351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.056211 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jcpp8"] Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.070831 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv"] Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.075450 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7ff8p" podStartSLOduration=119.075433813 podStartE2EDuration="1m59.075433813s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:09.07415051 +0000 UTC m=+136.858229372" watchObservedRunningTime="2025-12-02 07:48:09.075433813 +0000 UTC m=+136.859512675" Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.087947 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k"] Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.096013 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sv92r"] Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.098138 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j"] Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.102449 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp" podStartSLOduration=119.102436764 podStartE2EDuration="1m59.102436764s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:09.089719419 +0000 UTC m=+136.873798281" watchObservedRunningTime="2025-12-02 07:48:09.102436764 +0000 UTC m=+136.886515626" Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.108547 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ncvcc"] Dec 02 07:48:09 crc kubenswrapper[4691]: W1202 07:48:09.112419 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39b63660_5845_41eb_94ca_ffc8ccb34413.slice/crio-d61e99ce44208ebabb5bb56694abe8cd67488e3c1e3d71056fb096caf0e9ebaa WatchSource:0}: Error finding container d61e99ce44208ebabb5bb56694abe8cd67488e3c1e3d71056fb096caf0e9ebaa: Status 404 returned error can't find the container with id d61e99ce44208ebabb5bb56694abe8cd67488e3c1e3d71056fb096caf0e9ebaa Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.121778 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-tcd8d" podStartSLOduration=119.121746788 
podStartE2EDuration="1m59.121746788s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:09.121584844 +0000 UTC m=+136.905663706" watchObservedRunningTime="2025-12-02 07:48:09.121746788 +0000 UTC m=+136.905825640" Dec 02 07:48:09 crc kubenswrapper[4691]: W1202 07:48:09.147964 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod943a92a5_ba00_456a_83f4_c383e252288a.slice/crio-c0c1dccf35ef668a603436db4ffac5c5d58e50d6d5a427dd5a0c8554261e43e8 WatchSource:0}: Error finding container c0c1dccf35ef668a603436db4ffac5c5d58e50d6d5a427dd5a0c8554261e43e8: Status 404 returned error can't find the container with id c0c1dccf35ef668a603436db4ffac5c5d58e50d6d5a427dd5a0c8554261e43e8 Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.148737 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:09 crc kubenswrapper[4691]: E1202 07:48:09.149293 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:09.649274663 +0000 UTC m=+137.433353525 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:09 crc kubenswrapper[4691]: W1202 07:48:09.150774 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod688a963d_2808_4961_a584_1ee4a3ada61d.slice/crio-73f7ae1975982d278b7f710642b6a634f02d8ab9a80b112c5428a80f69b87e5b WatchSource:0}: Error finding container 73f7ae1975982d278b7f710642b6a634f02d8ab9a80b112c5428a80f69b87e5b: Status 404 returned error can't find the container with id 73f7ae1975982d278b7f710642b6a634f02d8ab9a80b112c5428a80f69b87e5b Dec 02 07:48:09 crc kubenswrapper[4691]: W1202 07:48:09.151013 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d511610_c204_4e20_a95e_43a7b41332b8.slice/crio-05c6a8e6869226cb216fb387757f927a12b722c013053b41f3d15a915b457029 WatchSource:0}: Error finding container 05c6a8e6869226cb216fb387757f927a12b722c013053b41f3d15a915b457029: Status 404 returned error can't find the container with id 05c6a8e6869226cb216fb387757f927a12b722c013053b41f3d15a915b457029 Dec 02 07:48:09 crc kubenswrapper[4691]: W1202 07:48:09.155223 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod483d25f5_c04d_4c89_a260_59cc01d12255.slice/crio-15b95a58d2680712dcc361a3feffdd647a11a95968a00afedf9b41bd63402e1b 
WatchSource:0}: Error finding container 15b95a58d2680712dcc361a3feffdd647a11a95968a00afedf9b41bd63402e1b: Status 404 returned error can't find the container with id 15b95a58d2680712dcc361a3feffdd647a11a95968a00afedf9b41bd63402e1b Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.163075 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qvdbg" podStartSLOduration=119.163064816 podStartE2EDuration="1m59.163064816s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:09.16124419 +0000 UTC m=+136.945323052" watchObservedRunningTime="2025-12-02 07:48:09.163064816 +0000 UTC m=+136.947143678" Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.250246 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:09 crc kubenswrapper[4691]: E1202 07:48:09.250528 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:09.750511405 +0000 UTC m=+137.534590267 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.267430 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" podStartSLOduration=119.267410978 podStartE2EDuration="1m59.267410978s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:09.25695226 +0000 UTC m=+137.041031122" watchObservedRunningTime="2025-12-02 07:48:09.267410978 +0000 UTC m=+137.051489830" Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.268963 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs"] Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.280005 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-blfck" podStartSLOduration=119.27998825 podStartE2EDuration="1m59.27998825s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:09.279309842 +0000 UTC m=+137.063388714" watchObservedRunningTime="2025-12-02 07:48:09.27998825 +0000 UTC m=+137.064067112" Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.325235 4691 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-h7dkw" podStartSLOduration=119.325217738 podStartE2EDuration="1m59.325217738s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:09.322046086 +0000 UTC m=+137.106124958" watchObservedRunningTime="2025-12-02 07:48:09.325217738 +0000 UTC m=+137.109296600" Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.329603 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc"] Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.330939 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd"] Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.335166 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf"] Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.343821 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g"] Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.352021 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:09 crc kubenswrapper[4691]: E1202 07:48:09.352391 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:09.852375523 +0000 UTC m=+137.636454395 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:09 crc kubenswrapper[4691]: W1202 07:48:09.380992 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf24830de_18e3_4204_8996_e7f1b0d45aec.slice/crio-6494ef343af4e8dc90e9203411b95f54a39d2611544cd1af8f19333b7771d67b WatchSource:0}: Error finding container 6494ef343af4e8dc90e9203411b95f54a39d2611544cd1af8f19333b7771d67b: Status 404 returned error can't find the container with id 6494ef343af4e8dc90e9203411b95f54a39d2611544cd1af8f19333b7771d67b Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.453785 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8wz4g" podStartSLOduration=5.453767549 podStartE2EDuration="5.453767549s" podCreationTimestamp="2025-12-02 07:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:09.407120434 +0000 UTC m=+137.191199296" watchObservedRunningTime="2025-12-02 07:48:09.453767549 +0000 UTC m=+137.237846411" Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.454995 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:09 crc kubenswrapper[4691]: E1202 07:48:09.455364 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:09.955350059 +0000 UTC m=+137.739428921 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.556093 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:09 crc kubenswrapper[4691]: E1202 07:48:09.556773 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-12-02 07:48:10.056747825 +0000 UTC m=+137.840826687 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.663257 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:09 crc kubenswrapper[4691]: E1202 07:48:09.663750 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:10.163731454 +0000 UTC m=+137.947810316 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.764919 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:09 crc kubenswrapper[4691]: E1202 07:48:09.765300 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:10.265280774 +0000 UTC m=+138.049359636 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.766020 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 07:48:09 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld
Dec 02 07:48:09 crc kubenswrapper[4691]: [+]process-running ok
Dec 02 07:48:09 crc kubenswrapper[4691]: healthz check failed
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.766061 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.865961 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:09 crc kubenswrapper[4691]: E1202 07:48:09.866119 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:10.366092665 +0000 UTC m=+138.150171527 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.866258 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:09 crc kubenswrapper[4691]: E1202 07:48:09.866532 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:10.366519366 +0000 UTC m=+138.150598228 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.945689 4691 generic.go:334] "Generic (PLEG): container finished" podID="c4d13548-f8dc-4a70-a287-9bee33dd7dd4" containerID="d8b4d076c9743aaef7ee26203929b7f80a8ef69d771b0e932b6ae6c70c773fa3" exitCode=0
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.945827 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" event={"ID":"c4d13548-f8dc-4a70-a287-9bee33dd7dd4","Type":"ContainerDied","Data":"d8b4d076c9743aaef7ee26203929b7f80a8ef69d771b0e932b6ae6c70c773fa3"}
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.962093 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" event={"ID":"c8c8387e-bdc5-4d7a-ae05-776786ee7277","Type":"ContainerStarted","Data":"52a709a570bbda3046e39928b808a6cee6d8bbedbcad41389cc9e2482c9f1393"}
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.974063 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:09 crc kubenswrapper[4691]: E1202 07:48:09.974432 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:10.474402857 +0000 UTC m=+138.258481729 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
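The failures above all share one root cause: kubelet cannot find kubevirt.io.hostpath-provisioner in its list of registered CSI drivers. A node-level CSI driver announces itself by creating a registration socket under kubelet's plugin-registration directory; until that socket exists and the registration handshake completes, every MountDevice and TearDownAt for that driver's volumes is rejected, which is exactly the loop recorded here. A minimal diagnostic sketch for checking registration from the node follows; the plugins_registry path is the conventional kubelet default and an assumption on my part, not something this journal shows.

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

// Lists plugin-registration sockets the way an admin might when kubelet
// reports "driver name ... not found in the list of registered CSI drivers".
// Assumption: the default kubelet plugin-registration directory; a CRC
// deployment may differ. A diagnostic sketch, not kubelet's own lookup logic.
func main() {
	const regDir = "/var/lib/kubelet/plugins_registry" // assumed default path
	entries, err := os.ReadDir(regDir)
	if err != nil {
		fmt.Fprintf(os.Stderr, "cannot read %s: %v\n", regDir, err)
		os.Exit(1)
	}
	found := false
	for _, e := range entries {
		if strings.HasSuffix(e.Name(), ".sock") {
			fmt.Println("registered plugin socket:", filepath.Join(regDir, e.Name()))
			if strings.Contains(e.Name(), "kubevirt.io.hostpath-provisioner") {
				found = true
			}
		}
	}
	if !found {
		fmt.Println("kubevirt.io.hostpath-provisioner has no registration socket yet; " +
			"mounts will keep failing until its node plugin starts")
	}
}
```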
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.982482 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" event={"ID":"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9","Type":"ContainerStarted","Data":"1b3c8e6ffa91fe202799d5dff384bfae5c144da309ca88387bd099eddb4fe097"}
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.983498 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg"
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.984602 4691 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-mcdgg container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused" start-of-body=
Dec 02 07:48:09 crc kubenswrapper[4691]: I1202 07:48:09.984660 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" podUID="1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.13:6443/healthz\": dial tcp 10.217.0.13:6443: connect: connection refused"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.017248 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" podStartSLOduration=120.017229204 podStartE2EDuration="2m0.017229204s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:10.015065589 +0000 UTC m=+137.799144451" watchObservedRunningTime="2025-12-02 07:48:10.017229204 +0000 UTC m=+137.801308066"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.022578 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" event={"ID":"4b6e412f-cac4-4c37-89c7-2f3f56ec6e9c","Type":"ContainerStarted","Data":"02c009da520a900c3bf9cdb0494d7f4823dfbd8025dbb1b86dd64292921183cf"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.025875 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" event={"ID":"943a92a5-ba00-456a-83f4-c383e252288a","Type":"ContainerStarted","Data":"c0c1dccf35ef668a603436db4ffac5c5d58e50d6d5a427dd5a0c8554261e43e8"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.026587 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jcpp8" event={"ID":"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c","Type":"ContainerStarted","Data":"136fdbcf7a5ffb9cb2a06d8f60cbff1b0106ced6732c2373718dbe8e9752edf3"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.027302 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" event={"ID":"0d511610-c204-4e20-a95e-43a7b41332b8","Type":"ContainerStarted","Data":"05c6a8e6869226cb216fb387757f927a12b722c013053b41f3d15a915b457029"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.032918 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hs9sj" event={"ID":"27fa245f-f203-4dfd-9ca1-78bc7c9e17a6","Type":"ContainerStarted","Data":"9204c19d32f641e1ec6adb6f9b36a4cacc2666ff08dc0a217b89e3d57a76ccaa"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.034531 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" event={"ID":"349309ae-e421-4055-96a2-0480c5562853","Type":"ContainerStarted","Data":"ee2c0cbdb9fad112248278cef6f5c6648e939b419ef59174df5cd840ef3874d9"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.038126 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz" event={"ID":"1b9abc34-1981-4f2b-b827-5347ed0f3d7e","Type":"ContainerStarted","Data":"acb1320b32b405721df44f572ae2eb07f6f9e7b7177ce3e62a63e94c54d88e85"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.038155 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz" event={"ID":"1b9abc34-1981-4f2b-b827-5347ed0f3d7e","Type":"ContainerStarted","Data":"e03c6fd187927653ae7ecf0e0288751dd255fa75cf51f80a6f08462f63068ab7"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.039321 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.055556 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g" event={"ID":"4ec8623c-5335-43a5-8431-49e66d87b4f5","Type":"ContainerStarted","Data":"f5a47c24bb8b06b79c2e80c43dd922839a1088f3265d8703938626d21f18dba7"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.063346 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.086704 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:10 crc kubenswrapper[4691]: E1202 07:48:10.087128 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:10.587115353 +0000 UTC m=+138.371194215 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.109440 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" podStartSLOduration=120.109418524 podStartE2EDuration="2m0.109418524s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:10.094174444 +0000 UTC m=+137.878253306" watchObservedRunningTime="2025-12-02 07:48:10.109418524 +0000 UTC m=+137.893497386"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.121961 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n" event={"ID":"33710686-f504-4a80-a2d4-e42b0bfb3033","Type":"ContainerStarted","Data":"67561171fad5b4845657c92574da7ac9213c604d0811c1c551835e441a5092b0"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.122012 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n" event={"ID":"33710686-f504-4a80-a2d4-e42b0bfb3033","Type":"ContainerStarted","Data":"3dc5dc62ccbdea6b6ace0017261bbe7d97fb9d8e433705a1f5c62c7b2ee4fc52"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.143368 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8mw2g" podStartSLOduration=120.143346583 podStartE2EDuration="2m0.143346583s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:10.137986436 +0000 UTC m=+137.922065298" watchObservedRunningTime="2025-12-02 07:48:10.143346583 +0000 UTC m=+137.927425445"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.149977 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd" event={"ID":"2c561340-bdf6-4d50-85c4-10a9098e12b7","Type":"ContainerStarted","Data":"d60c5c12706638eeef57bf7bfa35b2cbb152ebea1f8940e03c0fc16a481385c7"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.187418 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hs9sj" podStartSLOduration=6.18739388 podStartE2EDuration="6.18739388s" podCreationTimestamp="2025-12-02 07:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:10.173348381 +0000 UTC m=+137.957427233" watchObservedRunningTime="2025-12-02 07:48:10.18739388 +0000 UTC m=+137.971472742"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.188014 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
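Each nestedpendingoperations.go:348 line above sets a deadline before the failed volume operation may run again; the reconciler keeps re-queuing the operation on its sync period, and attempts before the deadline are rejected, which is why the same pair of errors repeats at a steady cadence. The 500ms figure is the durationBeforeRetry printed in these lines; the doubling-with-a-cap behavior in the sketch below is an assumption about kubelet's exponential backoff pattern, not a transcription of its implementation.

```go
package main

import (
	"fmt"
	"time"
)

// Minimal sketch of the retry pattern visible in the log: a failed operation
// records a "no retries permitted until" deadline, and the delay grows on
// consecutive failures up to a cap. The 500ms initial delay matches
// durationBeforeRetry above; the doubling and the 2m cap are assumptions.
type backoff struct {
	delay, max time.Duration
	notBefore  time.Time
}

func (b *backoff) failed(now time.Time) {
	if b.delay == 0 {
		b.delay = 500 * time.Millisecond
	} else if b.delay < b.max {
		b.delay *= 2
		if b.delay > b.max {
			b.delay = b.max
		}
	}
	b.notBefore = now.Add(b.delay)
}

func (b *backoff) allowed(now time.Time) bool { return !now.Before(b.notBefore) }

func main() {
	b := &backoff{max: 2 * time.Minute}
	now := time.Date(2025, 12, 2, 7, 48, 9, 0, time.UTC)
	for i := 0; i < 5; i++ {
		b.failed(now)
		fmt.Printf("attempt %d failed; no retries permitted until %s (durationBeforeRetry %s)\n",
			i+1, b.notBefore.Format(time.RFC3339Nano), b.delay)
		now = b.notBefore // pretend the next attempt happens right at the deadline
	}
}
```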
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:10 crc kubenswrapper[4691]: E1202 07:48:10.189347 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:10.6893315 +0000 UTC m=+138.473410362 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.227513 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-x96cz" podStartSLOduration=120.227480037 podStartE2EDuration="2m0.227480037s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:10.226740748 +0000 UTC m=+138.010819600" watchObservedRunningTime="2025-12-02 07:48:10.227480037 +0000 UTC m=+138.011558899" Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.236272 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z" event={"ID":"42a8f203-7e80-4e55-bd79-afe843279906","Type":"ContainerStarted","Data":"6638b7651a270dc82711e6154f03c55fb0848cd0b9233f1b18b1a348c565edaf"} Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.255461 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wx6m2" event={"ID":"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4","Type":"ContainerStarted","Data":"f0d8a41f951868ea3425f4c830e96144fc129178e40d31649ba6dd18b201854e"} Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.262513 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf" event={"ID":"a0f413a7-cc30-430c-a9a9-b7eb6da2916d","Type":"ContainerStarted","Data":"b075e8c52846da9ac86821d58bef80d32aada400320d5707422d9f3e32342a15"} Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.277899 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lb7g9" event={"ID":"5d2fe571-0e98-42f1-8f71-5cdc773ec89e","Type":"ContainerStarted","Data":"954dcfa9abd16b3fa3dab53e1d963884da5b6b349c0053d7ebca8b6dd21d2015"} Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.300470 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:10 crc kubenswrapper[4691]: E1202 07:48:10.300839 4691 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:10.800823754 +0000 UTC m=+138.584902616 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.305489 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-8kj4z" podStartSLOduration=120.305476473 podStartE2EDuration="2m0.305476473s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:10.263022747 +0000 UTC m=+138.047101609" watchObservedRunningTime="2025-12-02 07:48:10.305476473 +0000 UTC m=+138.089555335" Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.306107 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g" event={"ID":"f24830de-18e3-4204-8996-e7f1b0d45aec","Type":"ContainerStarted","Data":"6494ef343af4e8dc90e9203411b95f54a39d2611544cd1af8f19333b7771d67b"} Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.345544 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" event={"ID":"54581392-8f19-498b-b24d-c35064382946","Type":"ContainerStarted","Data":"338fa48b37cfc623b6f25e814c0f6e6b47942615afe72f5c117fa6ca67ad998d"} Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.345978 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" event={"ID":"54581392-8f19-498b-b24d-c35064382946","Type":"ContainerStarted","Data":"d7bffd4dc2462a3a84254b57f39d2c71392ba723176ce8c581b21ab826e2e0b4"} Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.359206 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" event={"ID":"483d25f5-c04d-4c89-a260-59cc01d12255","Type":"ContainerStarted","Data":"15b95a58d2680712dcc361a3feffdd647a11a95968a00afedf9b41bd63402e1b"} Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.360099 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.360952 4691 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-tl75k container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" start-of-body= Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.360989 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" podUID="483d25f5-c04d-4c89-a260-59cc01d12255" 
containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.41:5443/healthz\": dial tcp 10.217.0.41:5443: connect: connection refused" Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.374946 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" event={"ID":"688a963d-2808-4961-a584-1ee4a3ada61d","Type":"ContainerStarted","Data":"73f7ae1975982d278b7f710642b6a634f02d8ab9a80b112c5428a80f69b87e5b"} Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.382885 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv" event={"ID":"39b63660-5845-41eb-94ca-ffc8ccb34413","Type":"ContainerStarted","Data":"d61e99ce44208ebabb5bb56694abe8cd67488e3c1e3d71056fb096caf0e9ebaa"} Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.395929 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-wx6m2" podStartSLOduration=120.395912669 podStartE2EDuration="2m0.395912669s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:10.305999927 +0000 UTC m=+138.090078789" watchObservedRunningTime="2025-12-02 07:48:10.395912669 +0000 UTC m=+138.179991531" Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.396445 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" podStartSLOduration=120.396439382 podStartE2EDuration="2m0.396439382s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:10.393919228 +0000 UTC m=+138.177998090" watchObservedRunningTime="2025-12-02 07:48:10.396439382 +0000 UTC m=+138.180518244" Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.396815 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ms2c" event={"ID":"3cf0e12e-115d-4ac5-b646-6ac98524d948","Type":"ContainerStarted","Data":"4b160a83e553d2af04e0bb9d3e1c5576b6712083cdcdcdd1333998be1eea859c"} Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.396862 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ms2c" event={"ID":"3cf0e12e-115d-4ac5-b646-6ac98524d948","Type":"ContainerStarted","Data":"6c09c3f28493aa5c2e9133830b3943efe34b3645eda7a068878ed499cba67441"} Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.404700 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:10 crc kubenswrapper[4691]: E1202 07:48:10.405984 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:10.905963296 +0000 UTC m=+138.690042158 (durationBeforeRetry 500ms). 
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.424667 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" event={"ID":"dfefdd27-4ee7-4194-990d-0199fbe83a47","Type":"ContainerStarted","Data":"6da892d7514a98ca7b75b4a5021ad0c31b7614ffebd80d1925bfbb5b2d58aa86"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.454663 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv" podStartSLOduration=120.454646533 podStartE2EDuration="2m0.454646533s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:10.453192155 +0000 UTC m=+138.237271027" watchObservedRunningTime="2025-12-02 07:48:10.454646533 +0000 UTC m=+138.238725395"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.470735 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d" event={"ID":"ce16c153-56c1-4d91-92db-e52b045186bb","Type":"ContainerStarted","Data":"dbe260f0b32e1f83bb790583fbb22dfd8f75d67d544e3be999da09a286261882"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.497852 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" event={"ID":"01d8467f-e617-4b68-ada8-440891bb4b51","Type":"ContainerStarted","Data":"cb076a0cffbb430a7d096860bbf548114a402e00aa8134de27396ef64d45ccff"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.497887 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" event={"ID":"01d8467f-e617-4b68-ada8-440891bb4b51","Type":"ContainerStarted","Data":"3fcc7df2432f2de6d113bcc2c598ae361a0fb6c416078bff2e4e802e345986fc"}
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.499052 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.499483 4691 patch_prober.go:28] interesting pod/downloads-7954f5f757-jsqnn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body=
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.499815 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jsqnn" podUID="db7d7a68-80e4-4e9f-a7b4-adcc14282d4d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.508545 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:10 crc kubenswrapper[4691]: E1202 07:48:10.510378 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.010364189 +0000 UTC m=+138.794443051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.539025 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.539232 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-jlb9l" podStartSLOduration=120.539211777 podStartE2EDuration="2m0.539211777s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:10.519656427 +0000 UTC m=+138.303735299" watchObservedRunningTime="2025-12-02 07:48:10.539211777 +0000 UTC m=+138.323290659"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.553712 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.615878 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:10 crc kubenswrapper[4691]: E1202 07:48:10.616211 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.116186148 +0000 UTC m=+138.900265010 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.616534 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.616978 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" podStartSLOduration=120.616963508 podStartE2EDuration="2m0.616963508s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:10.615954132 +0000 UTC m=+138.400033004" watchObservedRunningTime="2025-12-02 07:48:10.616963508 +0000 UTC m=+138.401042370"
Dec 02 07:48:10 crc kubenswrapper[4691]: E1202 07:48:10.618518 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.118506517 +0000 UTC m=+138.902585379 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
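The pod_startup_latency_tracker lines do simple arithmetic: podStartSLOduration is the gap between podCreationTimestamp and the watch-observed running time, with image-pull time excluded (the 0001-01-01 pulling timestamps are sentinels meaning no pull was recorded, so nothing is subtracted). Using the values from the oauth-openshift line earlier in this section, the sketch below reproduces the reported 2m0.017229204s:

```go
package main

import (
	"fmt"
	"time"
)

// Reproduces the podStartSLOduration arithmetic from a
// pod_startup_latency_tracker line, with both timestamps copied verbatim
// from the oauth-openshift-558db77b4-mcdgg entry above.
func main() {
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	created, err := time.Parse(layout, "2025-12-02 07:46:10 +0000 UTC")
	if err != nil {
		panic(err)
	}
	running, err := time.Parse(layout, "2025-12-02 07:48:10.017229204 +0000 UTC")
	if err != nil {
		panic(err)
	}
	// Prints 2m0.017229204s, matching podStartSLOduration=120.017229204.
	fmt.Println("podStartSLOduration =", running.Sub(created))
}
```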
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.710145 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-p896d" podStartSLOduration=120.710129063 podStartE2EDuration="2m0.710129063s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:10.662017991 +0000 UTC m=+138.446096853" watchObservedRunningTime="2025-12-02 07:48:10.710129063 +0000 UTC m=+138.494207925"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.718388 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:10 crc kubenswrapper[4691]: E1202 07:48:10.719268 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.219249937 +0000 UTC m=+139.003328799 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.767151 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 07:48:10 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld
Dec 02 07:48:10 crc kubenswrapper[4691]: [+]process-running ok
Dec 02 07:48:10 crc kubenswrapper[4691]: healthz check failed
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.767501 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.822083 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:10 crc kubenswrapper[4691]: E1202 07:48:10.822562 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.322546181 +0000 UTC m=+139.106625043 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:10 crc kubenswrapper[4691]: I1202 07:48:10.924389 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:10 crc kubenswrapper[4691]: E1202 07:48:10.925105 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.425085716 +0000 UTC m=+139.209164578 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.025987 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:11 crc kubenswrapper[4691]: E1202 07:48:11.026420 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.52640938 +0000 UTC m=+139.310488242 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.132132 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:11 crc kubenswrapper[4691]: E1202 07:48:11.132260 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.63223997 +0000 UTC m=+139.416318832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
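Two pods are competing for the same volume in this stretch: kubelet is tearing down pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 for the terminated pod 8f668bae-612b-4b75-9490-919e737c6a3b while simultaneously trying to mount it for the new image-registry pod, and both directions fail on the same unregistered driver. When triaging a dump like this, a small tally makes the loop obvious; the sketch below counts the two failure kinds per volume from a journal fed on stdin (the regexes match the exact phrases in these lines; it is a triage aid, not a general journal parser):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Usage: journalctl -u kubelet | go run tally.go
// Counts MountDevice and TearDown failures per volume name.
func main() {
	patterns := map[string]*regexp.Regexp{
		"MountDevice failed": regexp.MustCompile(`MountVolume\.MountDevice failed for volume "([^"]+)"`),
		"TearDown failed":    regexp.MustCompile(`UnmountVolume\.TearDown failed for volume "([^"]+)"`),
	}
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be long
	for sc.Scan() {
		for op, re := range patterns {
			if m := re.FindStringSubmatch(sc.Text()); m != nil {
				counts[op+" "+m[1]]++
			}
		}
	}
	for k, n := range counts {
		fmt.Printf("%4d  %s\n", n, k)
	}
}
```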
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.132680 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:11 crc kubenswrapper[4691]: E1202 07:48:11.132934 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.632926727 +0000 UTC m=+139.417005589 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.238270 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:11 crc kubenswrapper[4691]: E1202 07:48:11.238469 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.738417798 +0000 UTC m=+139.522496660 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.238557 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:11 crc kubenswrapper[4691]: E1202 07:48:11.239107 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.739100076 +0000 UTC m=+139.523178938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.342228 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:11 crc kubenswrapper[4691]: E1202 07:48:11.342930 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.842913903 +0000 UTC m=+139.626992765 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.444374 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:11 crc kubenswrapper[4691]: E1202 07:48:11.444849 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:11.944831512 +0000 UTC m=+139.728910374 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.498441 4691 patch_prober.go:28] interesting pod/console-operator-58897d9998-tcd8d container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.498504 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tcd8d" podUID="5c7ffef5-c616-4596-aab4-6daa6eed6d46" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.504771 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.504818 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.508027 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jcpp8" event={"ID":"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c","Type":"ContainerStarted","Data":"41c83675aad322e5c2a73247c3c65f2699b03b580ddf5a3a17af5132ee4008f1"}
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.508073 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jcpp8" event={"ID":"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c","Type":"ContainerStarted","Data":"054ccf62e57c4cda9524ff21188d68e842673ae08f8b46591665981678c34f69"}
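The router probe output embedded earlier is the standard aggregated healthz format: one [+] or [-] line per sub-check followed by an overall verdict, returned with HTTP 500 when any check fails, which is what the probe reports as "statuscode: 500". A minimal handler producing the same shape, with the check names taken from the log and the failing state hard-coded to match this snapshot (illustrative only, not the router's actual code):

```go
package main

import (
	"fmt"
	"log"
	"net/http"
)

// Serves a healthz-style aggregate like the one the router's startup probe
// reads: "[+]"/"[-]" per sub-check, then an overall verdict and a 200 or 500.
func main() {
	checks := []struct {
		name string
		ok   bool
	}{
		{"backend-http", false},
		{"has-synced", false},
		{"process-running", true},
	}
	http.HandleFunc("/healthz", func(w http.ResponseWriter, r *http.Request) {
		healthy := true
		body := ""
		for _, c := range checks {
			if c.ok {
				body += fmt.Sprintf("[+]%s ok\n", c.name)
			} else {
				body += fmt.Sprintf("[-]%s failed: reason withheld\n", c.name)
				healthy = false
			}
		}
		if healthy {
			w.WriteHeader(http.StatusOK)
			fmt.Fprint(w, body+"healthz check passed")
			return
		}
		w.WriteHeader(http.StatusInternalServerError) // the probe's "statuscode: 500"
		fmt.Fprint(w, body+"healthz check failed")
	})
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```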
event={"ID":"366b4073-3bd7-4d0f-ad0e-713f71cd8b5c","Type":"ContainerStarted","Data":"054ccf62e57c4cda9524ff21188d68e842673ae08f8b46591665981678c34f69"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.508105 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.515518 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n" event={"ID":"33710686-f504-4a80-a2d4-e42b0bfb3033","Type":"ContainerStarted","Data":"37f0fd9e714611471f809467e9790b30e099f12ff15f703522885f81d62b7230"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.515683 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n" Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.525555 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh" Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.525599 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" event={"ID":"688a963d-2808-4961-a584-1ee4a3ada61d","Type":"ContainerStarted","Data":"64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.525856 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.527034 4691 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sv92r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.527081 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" podUID="688a963d-2808-4961-a584-1ee4a3ada61d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.531927 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" event={"ID":"943a92a5-ba00-456a-83f4-c383e252288a","Type":"ContainerStarted","Data":"15ce1a00fdad12a14dc02d5b96b66ab5ee6272b87015b8a50b9639ac19175cfd"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.538207 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jcpp8" podStartSLOduration=7.538190152 podStartE2EDuration="7.538190152s" podCreationTimestamp="2025-12-02 07:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:11.530980677 +0000 UTC m=+139.315059539" watchObservedRunningTime="2025-12-02 07:48:11.538190152 +0000 UTC m=+139.322269014" Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.540675 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-lb7g9" 
event={"ID":"5d2fe571-0e98-42f1-8f71-5cdc773ec89e","Type":"ContainerStarted","Data":"e6ea2da107c3f79cd77161a949268cba91bdc4e6d5b15ab10ff7642ef9ef4990"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.548229 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:11 crc kubenswrapper[4691]: E1202 07:48:11.548545 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:12.048527486 +0000 UTC m=+139.832606348 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.550580 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd" event={"ID":"2c561340-bdf6-4d50-85c4-10a9098e12b7","Type":"ContainerStarted","Data":"bcef3cc32bbc95b3c034aec692c8ee5024678cb71665bdfe40e8cc2ea5ac46ba"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.559972 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8x9pv" event={"ID":"39b63660-5845-41eb-94ca-ffc8ccb34413","Type":"ContainerStarted","Data":"2fe1a568279689271f7be7aeee76d5452c1282700a0ec4c703089366a0c0d389"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.561607 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2pv42" event={"ID":"1eb687f4-51f6-4806-b5a0-e35639b4b019","Type":"ContainerStarted","Data":"0aafa0039d0458d2ba7f70e47511d581d0a81b16a4324e597432cae4341b1b46"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.562647 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ms2c" event={"ID":"3cf0e12e-115d-4ac5-b646-6ac98524d948","Type":"ContainerStarted","Data":"45a94b88a460ab5bf3365a6789fae38575ce6b1e7c00d442cdcc1098705e0ca1"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.568330 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k" event={"ID":"483d25f5-c04d-4c89-a260-59cc01d12255","Type":"ContainerStarted","Data":"43f1314817182bfac272464fbccec04bd37b9fd7893e22c876a7abebd69e4768"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.600048 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" event={"ID":"349309ae-e421-4055-96a2-0480c5562853","Type":"ContainerStarted","Data":"078a1ca1ae83f37e0363f75a98f72387446f8c2e952167d09664f9ef71cb250f"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.600096 4691 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" event={"ID":"349309ae-e421-4055-96a2-0480c5562853","Type":"ContainerStarted","Data":"8e7e9c261f3e63b40148aa54a346a0405b35558851c67d0a2b80d2be06111a11"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.622824 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf" event={"ID":"a0f413a7-cc30-430c-a9a9-b7eb6da2916d","Type":"ContainerStarted","Data":"61f138aa194029831cd89142aaf9eaa2a6fc506fccf5b31df9874ff557b35c7e"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.636407 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" podStartSLOduration=121.636364835 podStartE2EDuration="2m1.636364835s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:11.635097333 +0000 UTC m=+139.419176195" watchObservedRunningTime="2025-12-02 07:48:11.636364835 +0000 UTC m=+139.420443697" Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.644013 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" event={"ID":"0d511610-c204-4e20-a95e-43a7b41332b8","Type":"ContainerStarted","Data":"84341c89351afbd2867535fa08401da43e1e1c84b96a95ec392dba02cbaadbb0"} Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.652024 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:11 crc kubenswrapper[4691]: E1202 07:48:11.653814 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:12.153798031 +0000 UTC m=+139.937876993 (durationBeforeRetry 500ms). 
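The burst of "SyncLoop (PLEG): event for pod" lines is kubelet's pod lifecycle event generator translating container-runtime state changes into events the sync loop consumes; note csi-hostpathplugin-2pv42 starting among them, which is the missing hostpath driver's node plugin finally coming up. A toy sketch of that producer/consumer shape, with field names mirroring the event={ID,Type,Data} rendering above and the two sample values copied from logged events (everything else is illustrative, not kubelet's implementation):

```go
package main

import "fmt"

// PodLifecycleEvent mirrors the fields rendered in the log's event={...}.
type PodLifecycleEvent struct {
	ID   string // pod UID
	Type string // e.g. "ContainerStarted", "ContainerDied"
	Data string // container ID
}

func main() {
	events := make(chan PodLifecycleEvent, 2)
	events <- PodLifecycleEvent{
		ID:   "c4d13548-f8dc-4a70-a287-9bee33dd7dd4",
		Type: "ContainerDied",
		Data: "d8b4d076c9743aaef7ee26203929b7f80a8ef69d771b0e932b6ae6c70c773fa3",
	}
	events <- PodLifecycleEvent{
		ID:   "c8c8387e-bdc5-4d7a-ae05-776786ee7277",
		Type: "ContainerStarted",
		Data: "52a709a570bbda3046e39928b808a6cee6d8bbedbcad41389cc9e2482c9f1393",
	}
	close(events)
	for ev := range events {
		// kubelet would look up the pod by UID here and trigger a pod sync.
		fmt.Printf("SyncLoop (PLEG): event for pod ID=%s Type=%s Data=%s\n",
			ev.ID, ev.Type, ev.Data)
	}
}
```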
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.670078 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" event={"ID":"c4d13548-f8dc-4a70-a287-9bee33dd7dd4","Type":"ContainerStarted","Data":"3ccd365d5ee59aeafed5984bab6742ef592b013e9f624b6ff5bc8fa02a017749"}
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.676144 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n" podStartSLOduration=121.676128023 podStartE2EDuration="2m1.676128023s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:11.673530887 +0000 UTC m=+139.457609749" watchObservedRunningTime="2025-12-02 07:48:11.676128023 +0000 UTC m=+139.460206885"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.685985 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g" event={"ID":"f24830de-18e3-4204-8996-e7f1b0d45aec","Type":"ContainerStarted","Data":"7fbedcef739d40d0174e1fc0fcf08671407df10658192881b62cbad7b69d7c78"}
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.701985 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" event={"ID":"54581392-8f19-498b-b24d-c35064382946","Type":"ContainerStarted","Data":"147ffe03ca464f8955f656123d8fb78b3634ace8f1704c7d769f815c45396119"}
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.705026 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" event={"ID":"c8c8387e-bdc5-4d7a-ae05-776786ee7277","Type":"ContainerStarted","Data":"b49a21164f004c97a66e60b5b92cbb309100e2345c88885a18be0ab7d1cc0541"}
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.705054 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" event={"ID":"c8c8387e-bdc5-4d7a-ae05-776786ee7277","Type":"ContainerStarted","Data":"5fb728cdb6b0869dbc9b3e07b0d41b351f685fb0ceae89ba3e84d8e2400bf157"}
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.719428 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" podStartSLOduration=121.719409201 podStartE2EDuration="2m1.719409201s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:11.719251627 +0000 UTC m=+139.503330489" watchObservedRunningTime="2025-12-02 07:48:11.719409201 +0000 UTC m=+139.503488063"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.720603 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-4ms2c" podStartSLOduration=121.720596111 podStartE2EDuration="2m1.720596111s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:11.696893825 +0000 UTC m=+139.480972687" watchObservedRunningTime="2025-12-02 07:48:11.720596111 +0000 UTC m=+139.504674973"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.720905 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-nqndh"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.754787 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:11 crc kubenswrapper[4691]: E1202 07:48:11.755591 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:12.255573687 +0000 UTC m=+140.039652549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.768507 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.772228 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" podStartSLOduration=121.772210603 podStartE2EDuration="2m1.772210603s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:11.768915498 +0000 UTC m=+139.552994380" watchObservedRunningTime="2025-12-02 07:48:11.772210603 +0000 UTC m=+139.556289465"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.779170 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 07:48:11 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld
Dec 02 07:48:11 crc kubenswrapper[4691]: [+]process-running ok
Dec 02 07:48:11 crc kubenswrapper[4691]: healthz check failed
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.779219 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.820769 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-54xgq"]
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.821628 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54xgq"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.823412 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-nkzgs" podStartSLOduration=121.823395363 podStartE2EDuration="2m1.823395363s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:11.822550552 +0000 UTC m=+139.606629414" watchObservedRunningTime="2025-12-02 07:48:11.823395363 +0000 UTC m=+139.607474215"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.823811 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.859451 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54xgq"]
Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.859989 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:11 crc kubenswrapper[4691]: E1202 07:48:11.863544 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:12.363533011 +0000 UTC m=+140.147611873 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.961251 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.968198 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6c2ff9e-15cc-4fab-8039-b51552b052c0-catalog-content\") pod \"community-operators-54xgq\" (UID: \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\") " pod="openshift-marketplace/community-operators-54xgq" Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.968429 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54h5k\" (UniqueName: \"kubernetes.io/projected/b6c2ff9e-15cc-4fab-8039-b51552b052c0-kube-api-access-54h5k\") pod \"community-operators-54xgq\" (UID: \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\") " pod="openshift-marketplace/community-operators-54xgq" Dec 02 07:48:11 crc kubenswrapper[4691]: I1202 07:48:11.968492 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6c2ff9e-15cc-4fab-8039-b51552b052c0-utilities\") pod \"community-operators-54xgq\" (UID: \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\") " pod="openshift-marketplace/community-operators-54xgq" Dec 02 07:48:11 crc kubenswrapper[4691]: E1202 07:48:11.990974 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:12.490949043 +0000 UTC m=+140.275027895 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.003980 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ncvcc" podStartSLOduration=122.003958836 podStartE2EDuration="2m2.003958836s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:11.93965714 +0000 UTC m=+139.723736002" watchObservedRunningTime="2025-12-02 07:48:12.003958836 +0000 UTC m=+139.788037698" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.005945 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-6kpnf" podStartSLOduration=122.005938177 podStartE2EDuration="2m2.005938177s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:11.998518017 +0000 UTC m=+139.782596879" watchObservedRunningTime="2025-12-02 07:48:12.005938177 +0000 UTC m=+139.790017039" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.035882 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bjm9m"] Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.036843 4691 util.go:30] "No sandbox for pod can be found. 
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.055543 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nl6hp"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.056560 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.064262 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bjm9m"]
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.070436 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6c2ff9e-15cc-4fab-8039-b51552b052c0-catalog-content\") pod \"community-operators-54xgq\" (UID: \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\") " pod="openshift-marketplace/community-operators-54xgq"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.070515 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.070568 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54h5k\" (UniqueName: \"kubernetes.io/projected/b6c2ff9e-15cc-4fab-8039-b51552b052c0-kube-api-access-54h5k\") pod \"community-operators-54xgq\" (UID: \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\") " pod="openshift-marketplace/community-operators-54xgq"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.070607 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6c2ff9e-15cc-4fab-8039-b51552b052c0-utilities\") pod \"community-operators-54xgq\" (UID: \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\") " pod="openshift-marketplace/community-operators-54xgq"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.071488 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6c2ff9e-15cc-4fab-8039-b51552b052c0-utilities\") pod \"community-operators-54xgq\" (UID: \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\") " pod="openshift-marketplace/community-operators-54xgq"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.071805 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6c2ff9e-15cc-4fab-8039-b51552b052c0-catalog-content\") pod \"community-operators-54xgq\" (UID: \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\") " pod="openshift-marketplace/community-operators-54xgq"
Dec 02 07:48:12 crc kubenswrapper[4691]: E1202 07:48:12.072042 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:12.572030929 +0000 UTC m=+140.356109791 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.082271 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-lb7g9" podStartSLOduration=122.08225247 podStartE2EDuration="2m2.08225247s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:12.080946597 +0000 UTC m=+139.865025459" watchObservedRunningTime="2025-12-02 07:48:12.08225247 +0000 UTC m=+139.866331332"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.149927 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54h5k\" (UniqueName: \"kubernetes.io/projected/b6c2ff9e-15cc-4fab-8039-b51552b052c0-kube-api-access-54h5k\") pod \"community-operators-54xgq\" (UID: \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\") " pod="openshift-marketplace/community-operators-54xgq"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.173422 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.173669 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce6e92a-6368-4f1a-8926-88a28ff76460-utilities\") pod \"certified-operators-bjm9m\" (UID: \"1ce6e92a-6368-4f1a-8926-88a28ff76460\") " pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.173706 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce6e92a-6368-4f1a-8926-88a28ff76460-catalog-content\") pod \"certified-operators-bjm9m\" (UID: \"1ce6e92a-6368-4f1a-8926-88a28ff76460\") " pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.173803 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdgjb\" (UniqueName: \"kubernetes.io/projected/1ce6e92a-6368-4f1a-8926-88a28ff76460-kube-api-access-vdgjb\") pod \"certified-operators-bjm9m\" (UID: \"1ce6e92a-6368-4f1a-8926-88a28ff76460\") " pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:48:12 crc kubenswrapper[4691]: E1202 07:48:12.173954 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:12.673938757 +0000 UTC m=+140.458017619 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.202511 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hd92g" podStartSLOduration=122.202496529 podStartE2EDuration="2m2.202496529s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:12.199160793 +0000 UTC m=+139.983239655" watchObservedRunningTime="2025-12-02 07:48:12.202496529 +0000 UTC m=+139.986575391"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.202677 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hffd" podStartSLOduration=122.202671663 podStartE2EDuration="2m2.202671663s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:12.167619436 +0000 UTC m=+139.951698298" watchObservedRunningTime="2025-12-02 07:48:12.202671663 +0000 UTC m=+139.986750525"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.231048 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tmd9f"]
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.232057 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmd9f"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.253482 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-7pvc7" podStartSLOduration=122.253465044 podStartE2EDuration="2m2.253465044s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:12.24746832 +0000 UTC m=+140.031547182" watchObservedRunningTime="2025-12-02 07:48:12.253465044 +0000 UTC m=+140.037543906"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.263089 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmd9f"]
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.276442 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdgjb\" (UniqueName: \"kubernetes.io/projected/1ce6e92a-6368-4f1a-8926-88a28ff76460-kube-api-access-vdgjb\") pod \"certified-operators-bjm9m\" (UID: \"1ce6e92a-6368-4f1a-8926-88a28ff76460\") " pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.276493 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.276561 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce6e92a-6368-4f1a-8926-88a28ff76460-utilities\") pod \"certified-operators-bjm9m\" (UID: \"1ce6e92a-6368-4f1a-8926-88a28ff76460\") " pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.276580 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce6e92a-6368-4f1a-8926-88a28ff76460-catalog-content\") pod \"certified-operators-bjm9m\" (UID: \"1ce6e92a-6368-4f1a-8926-88a28ff76460\") " pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.277019 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce6e92a-6368-4f1a-8926-88a28ff76460-catalog-content\") pod \"certified-operators-bjm9m\" (UID: \"1ce6e92a-6368-4f1a-8926-88a28ff76460\") " pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:48:12 crc kubenswrapper[4691]: E1202 07:48:12.277447 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:12.777437557 +0000 UTC m=+140.561516409 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.277804 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce6e92a-6368-4f1a-8926-88a28ff76460-utilities\") pod \"certified-operators-bjm9m\" (UID: \"1ce6e92a-6368-4f1a-8926-88a28ff76460\") " pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.313317 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-tl75k"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.318970 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdgjb\" (UniqueName: \"kubernetes.io/projected/1ce6e92a-6368-4f1a-8926-88a28ff76460-kube-api-access-vdgjb\") pod \"certified-operators-bjm9m\" (UID: \"1ce6e92a-6368-4f1a-8926-88a28ff76460\") " pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.378256 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.378530 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrkmq\" (UniqueName: \"kubernetes.io/projected/aa15a612-ba38-4138-86e2-9840652f1724-kube-api-access-zrkmq\") pod \"community-operators-tmd9f\" (UID: \"aa15a612-ba38-4138-86e2-9840652f1724\") " pod="openshift-marketplace/community-operators-tmd9f"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.378560 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa15a612-ba38-4138-86e2-9840652f1724-catalog-content\") pod \"community-operators-tmd9f\" (UID: \"aa15a612-ba38-4138-86e2-9840652f1724\") " pod="openshift-marketplace/community-operators-tmd9f"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.378581 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa15a612-ba38-4138-86e2-9840652f1724-utilities\") pod \"community-operators-tmd9f\" (UID: \"aa15a612-ba38-4138-86e2-9840652f1724\") " pod="openshift-marketplace/community-operators-tmd9f"
Dec 02 07:48:12 crc kubenswrapper[4691]: E1202 07:48:12.378689 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:12.878675659 +0000 UTC m=+140.662754521 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.380906 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.424500 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-54mc7"]
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.425607 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54mc7"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.447564 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54xgq"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.459281 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-b28bc" podStartSLOduration=122.459260182 podStartE2EDuration="2m2.459260182s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:12.458354529 +0000 UTC m=+140.242433391" watchObservedRunningTime="2025-12-02 07:48:12.459260182 +0000 UTC m=+140.243339044"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.476340 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54mc7"]
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.483206 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa15a612-ba38-4138-86e2-9840652f1724-utilities\") pod \"community-operators-tmd9f\" (UID: \"aa15a612-ba38-4138-86e2-9840652f1724\") " pod="openshift-marketplace/community-operators-tmd9f"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.483323 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.483349 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrkmq\" (UniqueName: \"kubernetes.io/projected/aa15a612-ba38-4138-86e2-9840652f1724-kube-api-access-zrkmq\") pod \"community-operators-tmd9f\" (UID: \"aa15a612-ba38-4138-86e2-9840652f1724\") " pod="openshift-marketplace/community-operators-tmd9f"
Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.483368 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa15a612-ba38-4138-86e2-9840652f1724-catalog-content\") pod \"community-operators-tmd9f\" (UID: \"aa15a612-ba38-4138-86e2-9840652f1724\") " pod="openshift-marketplace/community-operators-tmd9f"
pod="openshift-marketplace/community-operators-tmd9f" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.483775 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa15a612-ba38-4138-86e2-9840652f1724-catalog-content\") pod \"community-operators-tmd9f\" (UID: \"aa15a612-ba38-4138-86e2-9840652f1724\") " pod="openshift-marketplace/community-operators-tmd9f" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.483978 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa15a612-ba38-4138-86e2-9840652f1724-utilities\") pod \"community-operators-tmd9f\" (UID: \"aa15a612-ba38-4138-86e2-9840652f1724\") " pod="openshift-marketplace/community-operators-tmd9f" Dec 02 07:48:12 crc kubenswrapper[4691]: E1202 07:48:12.484255 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:12.984244002 +0000 UTC m=+140.768322864 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.533498 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrkmq\" (UniqueName: \"kubernetes.io/projected/aa15a612-ba38-4138-86e2-9840652f1724-kube-api-access-zrkmq\") pod \"community-operators-tmd9f\" (UID: \"aa15a612-ba38-4138-86e2-9840652f1724\") " pod="openshift-marketplace/community-operators-tmd9f" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.587508 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.587669 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce744080-92af-4598-9f69-23dae48f82b5-utilities\") pod \"certified-operators-54mc7\" (UID: \"ce744080-92af-4598-9f69-23dae48f82b5\") " pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.587730 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvkvq\" (UniqueName: \"kubernetes.io/projected/ce744080-92af-4598-9f69-23dae48f82b5-kube-api-access-gvkvq\") pod \"certified-operators-54mc7\" (UID: \"ce744080-92af-4598-9f69-23dae48f82b5\") " pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.587798 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce744080-92af-4598-9f69-23dae48f82b5-catalog-content\") pod 
\"certified-operators-54mc7\" (UID: \"ce744080-92af-4598-9f69-23dae48f82b5\") " pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:48:12 crc kubenswrapper[4691]: E1202 07:48:12.587924 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:13.087910586 +0000 UTC m=+140.871989438 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.607961 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmd9f" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.693619 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce744080-92af-4598-9f69-23dae48f82b5-utilities\") pod \"certified-operators-54mc7\" (UID: \"ce744080-92af-4598-9f69-23dae48f82b5\") " pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.693705 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvkvq\" (UniqueName: \"kubernetes.io/projected/ce744080-92af-4598-9f69-23dae48f82b5-kube-api-access-gvkvq\") pod \"certified-operators-54mc7\" (UID: \"ce744080-92af-4598-9f69-23dae48f82b5\") " pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.693776 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce744080-92af-4598-9f69-23dae48f82b5-catalog-content\") pod \"certified-operators-54mc7\" (UID: \"ce744080-92af-4598-9f69-23dae48f82b5\") " pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.693813 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:12 crc kubenswrapper[4691]: E1202 07:48:12.694118 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:13.194105995 +0000 UTC m=+140.978184857 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.695050 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce744080-92af-4598-9f69-23dae48f82b5-utilities\") pod \"certified-operators-54mc7\" (UID: \"ce744080-92af-4598-9f69-23dae48f82b5\") " pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.695563 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce744080-92af-4598-9f69-23dae48f82b5-catalog-content\") pod \"certified-operators-54mc7\" (UID: \"ce744080-92af-4598-9f69-23dae48f82b5\") " pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.751089 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvkvq\" (UniqueName: \"kubernetes.io/projected/ce744080-92af-4598-9f69-23dae48f82b5-kube-api-access-gvkvq\") pod \"certified-operators-54mc7\" (UID: \"ce744080-92af-4598-9f69-23dae48f82b5\") " pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.764998 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:48:12 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Dec 02 07:48:12 crc kubenswrapper[4691]: [+]process-running ok Dec 02 07:48:12 crc kubenswrapper[4691]: healthz check failed Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.765057 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.771476 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" event={"ID":"c4d13548-f8dc-4a70-a287-9bee33dd7dd4","Type":"ContainerStarted","Data":"58c2765c627f984b339c616a4ed7c581abb3ecdf53ce39be8cf020643833c53b"} Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.799850 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:12 crc kubenswrapper[4691]: E1202 07:48:12.800166 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-02 07:48:13.300142979 +0000 UTC m=+141.084221841 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.800257 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2pv42" event={"ID":"1eb687f4-51f6-4806-b5a0-e35639b4b019","Type":"ContainerStarted","Data":"8c770b7a5f5193a06089504d7ceacf955690cc6e63d6a39b7fe5a97cefe5f304"} Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.801209 4691 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-sv92r container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.801259 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" podUID="688a963d-2808-4961-a584-1ee4a3ada61d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.859328 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:48:12 crc kubenswrapper[4691]: I1202 07:48:12.917643 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:12 crc kubenswrapper[4691]: E1202 07:48:12.923804 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:13.423783505 +0000 UTC m=+141.207862367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.020362 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:13 crc kubenswrapper[4691]: E1202 07:48:13.020698 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:13.520680965 +0000 UTC m=+141.304759827 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.020958 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:13 crc kubenswrapper[4691]: E1202 07:48:13.021243 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:13.52123565 +0000 UTC m=+141.305314512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.122846 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:13 crc kubenswrapper[4691]: E1202 07:48:13.123293 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:13.623275852 +0000 UTC m=+141.407354714 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.224497 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:13 crc kubenswrapper[4691]: E1202 07:48:13.224790 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:13.724779001 +0000 UTC m=+141.508857863 (durationBeforeRetry 500ms). 
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.328285 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:13 crc kubenswrapper[4691]: E1202 07:48:13.328574 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:13.828552657 +0000 UTC m=+141.612631519 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.430448 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:13 crc kubenswrapper[4691]: E1202 07:48:13.431075 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:13.931063892 +0000 UTC m=+141.715142754 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.445204 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-54xgq"]
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.532344 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:13 crc kubenswrapper[4691]: E1202 07:48:13.532739 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:14.032724644 +0000 UTC m=+141.816803496 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.553998 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bjm9m"]
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.638488 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:13 crc kubenswrapper[4691]: E1202 07:48:13.638775 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:14.138748839 +0000 UTC m=+141.922827701 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.741439 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:13 crc kubenswrapper[4691]: E1202 07:48:13.742098 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:14.242082574 +0000 UTC m=+142.026161436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.776966 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 02 07:48:13 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld
Dec 02 07:48:13 crc kubenswrapper[4691]: [+]process-running ok
Dec 02 07:48:13 crc kubenswrapper[4691]: healthz check failed
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.777039 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.831718 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54xgq" event={"ID":"b6c2ff9e-15cc-4fab-8039-b51552b052c0","Type":"ContainerStarted","Data":"02e8895f04712c2534092602cd9f0342aa4177b4482010bb038dc2a57d77bb0b"}
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.843649 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2pv42" event={"ID":"1eb687f4-51f6-4806-b5a0-e35639b4b019","Type":"ContainerStarted","Data":"563145c2fd3d8fea158a73afbae1b6c56905eea94e3687532283138dccd85f72"}
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.844615 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjm9m" event={"ID":"1ce6e92a-6368-4f1a-8926-88a28ff76460","Type":"ContainerStarted","Data":"ffaaa3a5a715cf02e4244df4ea7296012780a895efe13c535c80324509f3a0d2"}
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.844753 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:13 crc kubenswrapper[4691]: E1202 07:48:13.845023 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:14.345008369 +0000 UTC m=+142.129087231 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.845856 4691 generic.go:334] "Generic (PLEG): container finished" podID="943a92a5-ba00-456a-83f4-c383e252288a" containerID="15ce1a00fdad12a14dc02d5b96b66ab5ee6272b87015b8a50b9639ac19175cfd" exitCode=0
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.846287 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" event={"ID":"943a92a5-ba00-456a-83f4-c383e252288a","Type":"ContainerDied","Data":"15ce1a00fdad12a14dc02d5b96b66ab5ee6272b87015b8a50b9639ac19175cfd"}
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.853310 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmd9f"]
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.908312 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5sc9f"]
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.911337 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sc9f"
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.912439 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-54mc7"]
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.924394 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.961354 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:13 crc kubenswrapper[4691]: E1202 07:48:13.975276 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:14.475254004 +0000 UTC m=+142.259332866 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:13 crc kubenswrapper[4691]: I1202 07:48:13.985894 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sc9f"]
Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.065598 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d25a09-ad32-4de2-860e-250010e610cb-catalog-content\") pod \"redhat-marketplace-5sc9f\" (UID: \"81d25a09-ad32-4de2-860e-250010e610cb\") " pod="openshift-marketplace/redhat-marketplace-5sc9f"
Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.065645 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d25a09-ad32-4de2-860e-250010e610cb-utilities\") pod \"redhat-marketplace-5sc9f\" (UID: \"81d25a09-ad32-4de2-860e-250010e610cb\") " pod="openshift-marketplace/redhat-marketplace-5sc9f"
Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.065680 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s6xq\" (UniqueName: \"kubernetes.io/projected/81d25a09-ad32-4de2-860e-250010e610cb-kube-api-access-4s6xq\") pod \"redhat-marketplace-5sc9f\" (UID: \"81d25a09-ad32-4de2-860e-250010e610cb\") " pod="openshift-marketplace/redhat-marketplace-5sc9f"
Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.065712 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:48:14 crc kubenswrapper[4691]: E1202 07:48:14.069302 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:14.569281471 +0000 UTC m=+142.353360333 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.134266 4691 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.167124 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.167504 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4s6xq\" (UniqueName: \"kubernetes.io/projected/81d25a09-ad32-4de2-860e-250010e610cb-kube-api-access-4s6xq\") pod \"redhat-marketplace-5sc9f\" (UID: \"81d25a09-ad32-4de2-860e-250010e610cb\") " pod="openshift-marketplace/redhat-marketplace-5sc9f"
Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.167619 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d25a09-ad32-4de2-860e-250010e610cb-catalog-content\") pod \"redhat-marketplace-5sc9f\" (UID: \"81d25a09-ad32-4de2-860e-250010e610cb\") " pod="openshift-marketplace/redhat-marketplace-5sc9f"
Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.167667 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d25a09-ad32-4de2-860e-250010e610cb-utilities\") pod \"redhat-marketplace-5sc9f\" (UID: \"81d25a09-ad32-4de2-860e-250010e610cb\") " pod="openshift-marketplace/redhat-marketplace-5sc9f"
Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.168218 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d25a09-ad32-4de2-860e-250010e610cb-utilities\") pod \"redhat-marketplace-5sc9f\" (UID: \"81d25a09-ad32-4de2-860e-250010e610cb\") " pod="openshift-marketplace/redhat-marketplace-5sc9f"
Dec 02 07:48:14 crc kubenswrapper[4691]: E1202 07:48:14.168362 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:14.668338717 +0000 UTC m=+142.452417599 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.168534 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d25a09-ad32-4de2-860e-250010e610cb-catalog-content\") pod \"redhat-marketplace-5sc9f\" (UID: \"81d25a09-ad32-4de2-860e-250010e610cb\") " pod="openshift-marketplace/redhat-marketplace-5sc9f" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.193331 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s6xq\" (UniqueName: \"kubernetes.io/projected/81d25a09-ad32-4de2-860e-250010e610cb-kube-api-access-4s6xq\") pod \"redhat-marketplace-5sc9f\" (UID: \"81d25a09-ad32-4de2-860e-250010e610cb\") " pod="openshift-marketplace/redhat-marketplace-5sc9f" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.204965 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2cxnn"] Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.211042 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.231957 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2cxnn"] Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.269469 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:14 crc kubenswrapper[4691]: E1202 07:48:14.269857 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:14.769842115 +0000 UTC m=+142.553920977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.293142 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sc9f" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.370399 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.370566 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-utilities\") pod \"redhat-marketplace-2cxnn\" (UID: \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\") " pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.370604 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-catalog-content\") pod \"redhat-marketplace-2cxnn\" (UID: \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\") " pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.370678 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8j7dq\" (UniqueName: \"kubernetes.io/projected/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-kube-api-access-8j7dq\") pod \"redhat-marketplace-2cxnn\" (UID: \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\") " pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:48:14 crc kubenswrapper[4691]: E1202 07:48:14.370834 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:14.870820421 +0000 UTC m=+142.654899283 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.472551 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8j7dq\" (UniqueName: \"kubernetes.io/projected/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-kube-api-access-8j7dq\") pod \"redhat-marketplace-2cxnn\" (UID: \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\") " pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.472643 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-utilities\") pod \"redhat-marketplace-2cxnn\" (UID: \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\") " pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.472670 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-catalog-content\") pod \"redhat-marketplace-2cxnn\" (UID: \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\") " pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.472711 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:14 crc kubenswrapper[4691]: E1202 07:48:14.473087 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:14.973073619 +0000 UTC m=+142.757152491 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.473936 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-utilities\") pod \"redhat-marketplace-2cxnn\" (UID: \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\") " pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.474217 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-catalog-content\") pod \"redhat-marketplace-2cxnn\" (UID: \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\") " pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.508059 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8j7dq\" (UniqueName: \"kubernetes.io/projected/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-kube-api-access-8j7dq\") pod \"redhat-marketplace-2cxnn\" (UID: \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\") " pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.548800 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.575360 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:14 crc kubenswrapper[4691]: E1202 07:48:14.575551 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:15.075521561 +0000 UTC m=+142.859600433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.575591 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:14 crc kubenswrapper[4691]: E1202 07:48:14.575958 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:15.075942832 +0000 UTC m=+142.860021694 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.679041 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:14 crc kubenswrapper[4691]: E1202 07:48:14.679362 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-02 07:48:15.179331829 +0000 UTC m=+142.963410691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.679644 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:14 crc kubenswrapper[4691]: E1202 07:48:14.679934 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-02 07:48:15.179921594 +0000 UTC m=+142.964000456 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kq8jr" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.729706 4691 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-02T07:48:14.134310976Z","Handler":null,"Name":""} Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.737893 4691 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.737927 4691 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.769687 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:48:14 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Dec 02 07:48:14 crc kubenswrapper[4691]: [+]process-running ok Dec 02 07:48:14 crc kubenswrapper[4691]: healthz check failed Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.769728 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.780510 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.832857 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.860739 4691 generic.go:334] "Generic (PLEG): container finished" podID="ce744080-92af-4598-9f69-23dae48f82b5" containerID="074d1adf4620936bf3d597fac456174136c093ee72e0d8887010b423607434c7" exitCode=0 Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.860872 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54mc7" event={"ID":"ce744080-92af-4598-9f69-23dae48f82b5","Type":"ContainerDied","Data":"074d1adf4620936bf3d597fac456174136c093ee72e0d8887010b423607434c7"} Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.860906 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54mc7" event={"ID":"ce744080-92af-4598-9f69-23dae48f82b5","Type":"ContainerStarted","Data":"35cc602df2a7fd8be7fa823ea869d832002a544a8fe5c4aef6df000080c1ebc1"} Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.862842 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.863220 4691 generic.go:334] "Generic (PLEG): container finished" podID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" containerID="f56bae39991695327afadbc7b0f3fd3ae378d0d75eb256f89ebb6da23e80a2f0" exitCode=0 Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.863568 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54xgq" event={"ID":"b6c2ff9e-15cc-4fab-8039-b51552b052c0","Type":"ContainerDied","Data":"f56bae39991695327afadbc7b0f3fd3ae378d0d75eb256f89ebb6da23e80a2f0"} Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.866558 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-2pv42" event={"ID":"1eb687f4-51f6-4806-b5a0-e35639b4b019","Type":"ContainerStarted","Data":"f3134cb9e466b9f6fcc275ab3b4eff22fac8b460dec0189c5060d403801e572d"} Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.870283 4691 generic.go:334] "Generic (PLEG): container finished" podID="aa15a612-ba38-4138-86e2-9840652f1724" containerID="646f02725a6d68510243ecb1c33e4bc7bf44904652f9b2142ab845f5dff89e14" exitCode=0 Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.870333 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmd9f" event={"ID":"aa15a612-ba38-4138-86e2-9840652f1724","Type":"ContainerDied","Data":"646f02725a6d68510243ecb1c33e4bc7bf44904652f9b2142ab845f5dff89e14"} Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.870351 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmd9f" 
event={"ID":"aa15a612-ba38-4138-86e2-9840652f1724","Type":"ContainerStarted","Data":"26822aeca534992ba11ddfda8209edb0b1a5a8b0c18c2bd257084ea49168d634"} Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.873666 4691 generic.go:334] "Generic (PLEG): container finished" podID="1ce6e92a-6368-4f1a-8926-88a28ff76460" containerID="f0b5e3a5e90e9ee1b3de41394161451e0bf577e8a4e6067c6c73c2f5b2d619c9" exitCode=0 Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.874550 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjm9m" event={"ID":"1ce6e92a-6368-4f1a-8926-88a28ff76460","Type":"ContainerDied","Data":"f0b5e3a5e90e9ee1b3de41394161451e0bf577e8a4e6067c6c73c2f5b2d619c9"} Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.883038 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.892674 4691 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.892708 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.946244 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kq8jr\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") " pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.951958 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sc9f"] Dec 02 07:48:14 crc kubenswrapper[4691]: W1202 07:48:14.974500 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81d25a09_ad32_4de2_860e_250010e610cb.slice/crio-656c308f86c07359a948683d0723b26f2afb31374b8e352dd6a8d8041a3f670e WatchSource:0}: Error finding container 656c308f86c07359a948683d0723b26f2afb31374b8e352dd6a8d8041a3f670e: Status 404 returned error can't find the container with id 656c308f86c07359a948683d0723b26f2afb31374b8e352dd6a8d8041a3f670e Dec 02 07:48:14 crc kubenswrapper[4691]: I1202 07:48:14.997313 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-2pv42" podStartSLOduration=10.997283019 podStartE2EDuration="10.997283019s" podCreationTimestamp="2025-12-02 07:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:14.982341667 +0000 UTC m=+142.766420529" watchObservedRunningTime="2025-12-02 07:48:14.997283019 +0000 UTC m=+142.781361881" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.002114 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2cxnn"] Dec 02 07:48:15 crc kubenswrapper[4691]: W1202 07:48:15.008184 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2fd2d03_d859_4c45_b7fa_b9af4816e1cf.slice/crio-f7c58e2eab913fd7c50cc10e5d44057449acd99dfec65be6a1311659857093b0 WatchSource:0}: Error finding container f7c58e2eab913fd7c50cc10e5d44057449acd99dfec65be6a1311659857093b0: Status 404 returned error can't find the container with id f7c58e2eab913fd7c50cc10e5d44057449acd99dfec65be6a1311659857093b0 Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.095445 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.096097 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.142268 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.206199 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jmzz7"] Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.207542 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.211243 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.216935 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jmzz7"] Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.293267 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47426780-5ffb-47da-8a00-fb96b6a6099a-utilities\") pod \"redhat-operators-jmzz7\" (UID: \"47426780-5ffb-47da-8a00-fb96b6a6099a\") " pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.293633 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs7wc\" (UniqueName: \"kubernetes.io/projected/47426780-5ffb-47da-8a00-fb96b6a6099a-kube-api-access-rs7wc\") pod \"redhat-operators-jmzz7\" (UID: \"47426780-5ffb-47da-8a00-fb96b6a6099a\") " pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.293705 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47426780-5ffb-47da-8a00-fb96b6a6099a-catalog-content\") pod \"redhat-operators-jmzz7\" (UID: \"47426780-5ffb-47da-8a00-fb96b6a6099a\") " pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.313985 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.394726 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86sks\" (UniqueName: \"kubernetes.io/projected/943a92a5-ba00-456a-83f4-c383e252288a-kube-api-access-86sks\") pod \"943a92a5-ba00-456a-83f4-c383e252288a\" (UID: \"943a92a5-ba00-456a-83f4-c383e252288a\") " Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.394818 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/943a92a5-ba00-456a-83f4-c383e252288a-config-volume\") pod \"943a92a5-ba00-456a-83f4-c383e252288a\" (UID: \"943a92a5-ba00-456a-83f4-c383e252288a\") " Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.394878 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/943a92a5-ba00-456a-83f4-c383e252288a-secret-volume\") pod \"943a92a5-ba00-456a-83f4-c383e252288a\" (UID: \"943a92a5-ba00-456a-83f4-c383e252288a\") " Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.396109 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47426780-5ffb-47da-8a00-fb96b6a6099a-utilities\") pod \"redhat-operators-jmzz7\" (UID: \"47426780-5ffb-47da-8a00-fb96b6a6099a\") " pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.396205 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rs7wc\" (UniqueName: \"kubernetes.io/projected/47426780-5ffb-47da-8a00-fb96b6a6099a-kube-api-access-rs7wc\") pod \"redhat-operators-jmzz7\" (UID: \"47426780-5ffb-47da-8a00-fb96b6a6099a\") " pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.396669 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47426780-5ffb-47da-8a00-fb96b6a6099a-catalog-content\") pod \"redhat-operators-jmzz7\" (UID: \"47426780-5ffb-47da-8a00-fb96b6a6099a\") " pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.396798 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47426780-5ffb-47da-8a00-fb96b6a6099a-utilities\") pod \"redhat-operators-jmzz7\" (UID: \"47426780-5ffb-47da-8a00-fb96b6a6099a\") " pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.397043 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47426780-5ffb-47da-8a00-fb96b6a6099a-catalog-content\") pod \"redhat-operators-jmzz7\" (UID: \"47426780-5ffb-47da-8a00-fb96b6a6099a\") " pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.397743 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/943a92a5-ba00-456a-83f4-c383e252288a-config-volume" (OuterVolumeSpecName: "config-volume") pod "943a92a5-ba00-456a-83f4-c383e252288a" (UID: "943a92a5-ba00-456a-83f4-c383e252288a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.401045 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/943a92a5-ba00-456a-83f4-c383e252288a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "943a92a5-ba00-456a-83f4-c383e252288a" (UID: "943a92a5-ba00-456a-83f4-c383e252288a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.401088 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/943a92a5-ba00-456a-83f4-c383e252288a-kube-api-access-86sks" (OuterVolumeSpecName: "kube-api-access-86sks") pod "943a92a5-ba00-456a-83f4-c383e252288a" (UID: "943a92a5-ba00-456a-83f4-c383e252288a"). InnerVolumeSpecName "kube-api-access-86sks". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.410723 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs7wc\" (UniqueName: \"kubernetes.io/projected/47426780-5ffb-47da-8a00-fb96b6a6099a-kube-api-access-rs7wc\") pod \"redhat-operators-jmzz7\" (UID: \"47426780-5ffb-47da-8a00-fb96b6a6099a\") " pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.450016 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kq8jr"] Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.498402 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86sks\" (UniqueName: \"kubernetes.io/projected/943a92a5-ba00-456a-83f4-c383e252288a-kube-api-access-86sks\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.498463 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/943a92a5-ba00-456a-83f4-c383e252288a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.498477 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/943a92a5-ba00-456a-83f4-c383e252288a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.596214 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nd4nh"] Dec 02 07:48:15 crc kubenswrapper[4691]: E1202 07:48:15.596491 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="943a92a5-ba00-456a-83f4-c383e252288a" containerName="collect-profiles" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.596510 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="943a92a5-ba00-456a-83f4-c383e252288a" containerName="collect-profiles" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.596671 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="943a92a5-ba00-456a-83f4-c383e252288a" containerName="collect-profiles" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.597606 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.609362 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nd4nh"] Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.612443 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.701467 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103dafaf-ac5c-49e6-86c1-d604072e0ccc-catalog-content\") pod \"redhat-operators-nd4nh\" (UID: \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\") " pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.701512 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103dafaf-ac5c-49e6-86c1-d604072e0ccc-utilities\") pod \"redhat-operators-nd4nh\" (UID: \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\") " pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.701573 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfj7f\" (UniqueName: \"kubernetes.io/projected/103dafaf-ac5c-49e6-86c1-d604072e0ccc-kube-api-access-xfj7f\") pod \"redhat-operators-nd4nh\" (UID: \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\") " pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.769954 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:48:15 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Dec 02 07:48:15 crc kubenswrapper[4691]: [+]process-running ok Dec 02 07:48:15 crc kubenswrapper[4691]: healthz check failed Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.770218 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.802680 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103dafaf-ac5c-49e6-86c1-d604072e0ccc-catalog-content\") pod \"redhat-operators-nd4nh\" (UID: \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\") " pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.802782 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103dafaf-ac5c-49e6-86c1-d604072e0ccc-utilities\") pod \"redhat-operators-nd4nh\" (UID: \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\") " pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.802826 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfj7f\" (UniqueName: \"kubernetes.io/projected/103dafaf-ac5c-49e6-86c1-d604072e0ccc-kube-api-access-xfj7f\") pod 
\"redhat-operators-nd4nh\" (UID: \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\") " pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.803353 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103dafaf-ac5c-49e6-86c1-d604072e0ccc-catalog-content\") pod \"redhat-operators-nd4nh\" (UID: \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\") " pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.803566 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103dafaf-ac5c-49e6-86c1-d604072e0ccc-utilities\") pod \"redhat-operators-nd4nh\" (UID: \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\") " pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.829653 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfj7f\" (UniqueName: \"kubernetes.io/projected/103dafaf-ac5c-49e6-86c1-d604072e0ccc-kube-api-access-xfj7f\") pod \"redhat-operators-nd4nh\" (UID: \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\") " pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.885598 4691 generic.go:334] "Generic (PLEG): container finished" podID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" containerID="e135096d86c87d23e4ab7c113d105b649730b46c9b9ee67dc2c7f28c53ee458e" exitCode=0 Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.885708 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2cxnn" event={"ID":"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf","Type":"ContainerDied","Data":"e135096d86c87d23e4ab7c113d105b649730b46c9b9ee67dc2c7f28c53ee458e"} Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.885859 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2cxnn" event={"ID":"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf","Type":"ContainerStarted","Data":"f7c58e2eab913fd7c50cc10e5d44057449acd99dfec65be6a1311659857093b0"} Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.888244 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" event={"ID":"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec","Type":"ContainerStarted","Data":"bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c"} Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.888292 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" event={"ID":"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec","Type":"ContainerStarted","Data":"c04ed303f172948be9c4b6f940c935baac2667029a854ad418190686c4a95b97"} Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.888411 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.893189 4691 generic.go:334] "Generic (PLEG): container finished" podID="81d25a09-ad32-4de2-860e-250010e610cb" containerID="f3ae1cacbf19ec412bcb24480a55311b1aff1ccdb6f10cc563ad26acca924d48" exitCode=0 Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.893237 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sc9f" 
event={"ID":"81d25a09-ad32-4de2-860e-250010e610cb","Type":"ContainerDied","Data":"f3ae1cacbf19ec412bcb24480a55311b1aff1ccdb6f10cc563ad26acca924d48"} Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.893258 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sc9f" event={"ID":"81d25a09-ad32-4de2-860e-250010e610cb","Type":"ContainerStarted","Data":"656c308f86c07359a948683d0723b26f2afb31374b8e352dd6a8d8041a3f670e"} Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.895715 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.898338 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j" event={"ID":"943a92a5-ba00-456a-83f4-c383e252288a","Type":"ContainerDied","Data":"c0c1dccf35ef668a603436db4ffac5c5d58e50d6d5a427dd5a0c8554261e43e8"} Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.898364 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0c1dccf35ef668a603436db4ffac5c5d58e50d6d5a427dd5a0c8554261e43e8" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.935648 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.946003 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" podStartSLOduration=125.945983126 podStartE2EDuration="2m5.945983126s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:15.943749408 +0000 UTC m=+143.727828290" watchObservedRunningTime="2025-12-02 07:48:15.945983126 +0000 UTC m=+143.730061988" Dec 02 07:48:15 crc kubenswrapper[4691]: I1202 07:48:15.960517 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jmzz7"] Dec 02 07:48:15 crc kubenswrapper[4691]: W1202 07:48:15.975859 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47426780_5ffb_47da_8a00_fb96b6a6099a.slice/crio-04cb794ddda0de9b9bae776211c7e73b79ba3cbb0ce892ad72c315da1d649a16 WatchSource:0}: Error finding container 04cb794ddda0de9b9bae776211c7e73b79ba3cbb0ce892ad72c315da1d649a16: Status 404 returned error can't find the container with id 04cb794ddda0de9b9bae776211c7e73b79ba3cbb0ce892ad72c315da1d649a16 Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.224741 4691 patch_prober.go:28] interesting pod/downloads-7954f5f757-jsqnn container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.224818 4691 patch_prober.go:28] interesting pod/downloads-7954f5f757-jsqnn container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" start-of-body= Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.225060 4691 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console/downloads-7954f5f757-jsqnn" podUID="db7d7a68-80e4-4e9f-a7b4-adcc14282d4d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.225103 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jsqnn" podUID="db7d7a68-80e4-4e9f-a7b4-adcc14282d4d" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.11:8080/\": dial tcp 10.217.0.11:8080: connect: connection refused" Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.353395 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nd4nh"] Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.442695 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.442787 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.456619 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.595279 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.630609 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-tcd8d" Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.758731 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.775455 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:48:16 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Dec 02 07:48:16 crc kubenswrapper[4691]: [+]process-running ok Dec 02 07:48:16 crc kubenswrapper[4691]: healthz check failed Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.775510 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.808042 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.809304 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.811294 4691 patch_prober.go:28] interesting pod/console-f9d7485db-wx6m2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.22:8443/health\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Dec 02 07:48:16 crc 
kubenswrapper[4691]: I1202 07:48:16.811375 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-wx6m2" podUID="c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" containerName="console" probeResult="failure" output="Get \"https://10.217.0.22:8443/health\": dial tcp 10.217.0.22:8443: connect: connection refused" Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.951270 4691 generic.go:334] "Generic (PLEG): container finished" podID="47426780-5ffb-47da-8a00-fb96b6a6099a" containerID="6734ca0fedf7608ddcbc6a6e75cddd2d90992408ee638e0ff3397b0f8cc3e8d2" exitCode=0 Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.951597 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmzz7" event={"ID":"47426780-5ffb-47da-8a00-fb96b6a6099a","Type":"ContainerDied","Data":"6734ca0fedf7608ddcbc6a6e75cddd2d90992408ee638e0ff3397b0f8cc3e8d2"} Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.951701 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmzz7" event={"ID":"47426780-5ffb-47da-8a00-fb96b6a6099a","Type":"ContainerStarted","Data":"04cb794ddda0de9b9bae776211c7e73b79ba3cbb0ce892ad72c315da1d649a16"} Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.970802 4691 generic.go:334] "Generic (PLEG): container finished" podID="103dafaf-ac5c-49e6-86c1-d604072e0ccc" containerID="2a8a0f8e5fb6b43795134c109078a61538b3008a29a32e2a032579828277d6e4" exitCode=0 Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.973043 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd4nh" event={"ID":"103dafaf-ac5c-49e6-86c1-d604072e0ccc","Type":"ContainerDied","Data":"2a8a0f8e5fb6b43795134c109078a61538b3008a29a32e2a032579828277d6e4"} Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.973067 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd4nh" event={"ID":"103dafaf-ac5c-49e6-86c1-d604072e0ccc","Type":"ContainerStarted","Data":"3ded147b90fe056dca8a47d094d9f353ff0779f60febcbbd028ee8207a1ffcbc"} Dec 02 07:48:16 crc kubenswrapper[4691]: I1202 07:48:16.997190 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ng6l4" Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.105584 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.292787 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.293676 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.295825 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.296030 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.304994 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.354781 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.354952 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.456647 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.456732 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.456844 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.488016 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.643139 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.762065 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:48:17 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Dec 02 07:48:17 crc kubenswrapper[4691]: [+]process-running ok Dec 02 07:48:17 crc kubenswrapper[4691]: healthz check failed Dec 02 07:48:17 crc kubenswrapper[4691]: I1202 07:48:17.762118 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.182867 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 02 07:48:18 crc kubenswrapper[4691]: W1202 07:48:18.221928 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd0750d56_8f83_4dbc_8c7b_8e3f0f78ca96.slice/crio-ac6a48ed1809e9ddc8decfe1cbe47815d7e790d15710b4303e3d1a67d49364e3 WatchSource:0}: Error finding container ac6a48ed1809e9ddc8decfe1cbe47815d7e790d15710b4303e3d1a67d49364e3: Status 404 returned error can't find the container with id ac6a48ed1809e9ddc8decfe1cbe47815d7e790d15710b4303e3d1a67d49364e3 Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.479599 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.479647 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.479685 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.482108 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.486400 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.490458 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.580600 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.595386 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.680607 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.693026 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.700792 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.762441 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:48:18 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Dec 02 07:48:18 crc kubenswrapper[4691]: [+]process-running ok Dec 02 07:48:18 crc kubenswrapper[4691]: healthz check failed Dec 02 07:48:18 crc kubenswrapper[4691]: I1202 07:48:18.762546 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:48:19 crc kubenswrapper[4691]: I1202 07:48:18.998172 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96","Type":"ContainerStarted","Data":"ac6a48ed1809e9ddc8decfe1cbe47815d7e790d15710b4303e3d1a67d49364e3"} Dec 02 07:48:19 crc kubenswrapper[4691]: W1202 07:48:19.305640 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-013f12d1c723ac67b201e6e11e0270def5a71cef1bad76180fc939f9d78dac68 WatchSource:0}: Error finding container 013f12d1c723ac67b201e6e11e0270def5a71cef1bad76180fc939f9d78dac68: Status 404 returned error can't find the container with id 013f12d1c723ac67b201e6e11e0270def5a71cef1bad76180fc939f9d78dac68 Dec 02 07:48:19 crc kubenswrapper[4691]: W1202 07:48:19.722402 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-cfcec6be0ddef77e92a0fe52651c507778f95fdbaec96310a94733be94a20ff8 WatchSource:0}: Error finding container cfcec6be0ddef77e92a0fe52651c507778f95fdbaec96310a94733be94a20ff8: Status 404 returned error can't find the container with id cfcec6be0ddef77e92a0fe52651c507778f95fdbaec96310a94733be94a20ff8 Dec 02 07:48:19 crc kubenswrapper[4691]: I1202 07:48:19.766341 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:48:19 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Dec 02 07:48:19 crc kubenswrapper[4691]: [+]process-running ok Dec 02 07:48:19 crc kubenswrapper[4691]: healthz check failed Dec 02 07:48:19 crc kubenswrapper[4691]: I1202 07:48:19.766406 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.032961 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e4c2c7e7cec31a29e7c09f8a0016c8c62fb9a0e7d7fe4d13daf35a85f17a7a0a"} Dec 02 07:48:20 crc kubenswrapper[4691]: 
I1202 07:48:20.076331 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96","Type":"ContainerStarted","Data":"29c42cbaafa3c7a2806957b8a3ec02dfa795cb285e6467e318d2d4b77e7d26fe"} Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.083028 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"013f12d1c723ac67b201e6e11e0270def5a71cef1bad76180fc939f9d78dac68"} Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.084929 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cfcec6be0ddef77e92a0fe52651c507778f95fdbaec96310a94733be94a20ff8"} Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.099719 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.099696285 podStartE2EDuration="3.099696285s" podCreationTimestamp="2025-12-02 07:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:20.091046124 +0000 UTC m=+147.875124996" watchObservedRunningTime="2025-12-02 07:48:20.099696285 +0000 UTC m=+147.883775157" Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.652672 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.653614 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.656415 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.657804 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.659882 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.716806 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c7cbf63-1ec7-408d-ba91-8b94b4a811fd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.716929 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c7cbf63-1ec7-408d-ba91-8b94b4a811fd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.778817 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:48:20 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Dec 02 07:48:20 crc kubenswrapper[4691]: [+]process-running ok Dec 02 07:48:20 crc kubenswrapper[4691]: healthz check failed Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.778915 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.818634 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c7cbf63-1ec7-408d-ba91-8b94b4a811fd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.818707 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c7cbf63-1ec7-408d-ba91-8b94b4a811fd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.818880 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c7cbf63-1ec7-408d-ba91-8b94b4a811fd-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.860110 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c7cbf63-1ec7-408d-ba91-8b94b4a811fd-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:48:20 crc kubenswrapper[4691]: I1202 07:48:20.988377 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:48:21 crc kubenswrapper[4691]: I1202 07:48:21.186414 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"464e3da2e1e19ac07619b406b85cb9ce65434bd6e20f8819f3681b26d79eee17"} Dec 02 07:48:21 crc kubenswrapper[4691]: I1202 07:48:21.204683 4691 generic.go:334] "Generic (PLEG): container finished" podID="d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96" containerID="29c42cbaafa3c7a2806957b8a3ec02dfa795cb285e6467e318d2d4b77e7d26fe" exitCode=0 Dec 02 07:48:21 crc kubenswrapper[4691]: I1202 07:48:21.204844 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96","Type":"ContainerDied","Data":"29c42cbaafa3c7a2806957b8a3ec02dfa795cb285e6467e318d2d4b77e7d26fe"} Dec 02 07:48:21 crc kubenswrapper[4691]: I1202 07:48:21.228504 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3223b671fbf75144bc542497f24b5132cb6742019de314fdd7efdc3734dce7ff"} Dec 02 07:48:21 crc kubenswrapper[4691]: I1202 07:48:21.241008 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9a69b775553a3aabf693d6ebdb57b618dc1f69615c2d057d138554fadbbad4e4"} Dec 02 07:48:21 crc kubenswrapper[4691]: I1202 07:48:21.241143 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:48:21 crc kubenswrapper[4691]: I1202 07:48:21.718507 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 02 07:48:21 crc kubenswrapper[4691]: I1202 07:48:21.761777 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:48:21 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Dec 02 07:48:21 crc kubenswrapper[4691]: [+]process-running ok Dec 02 07:48:21 crc kubenswrapper[4691]: healthz check failed Dec 02 07:48:21 crc kubenswrapper[4691]: I1202 07:48:21.761827 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:48:21 crc kubenswrapper[4691]: I1202 07:48:21.899810 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Dec 02 07:48:21 crc kubenswrapper[4691]: I1202 07:48:21.899946 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:48:22 crc kubenswrapper[4691]: I1202 07:48:22.237698 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jcpp8" Dec 02 07:48:22 crc kubenswrapper[4691]: I1202 07:48:22.282171 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd","Type":"ContainerStarted","Data":"e5025d8f34b7f1249c77c8e3f9f0596fdf46899df8eacc37300d184a6c50e4bc"} Dec 02 07:48:22 crc kubenswrapper[4691]: I1202 07:48:22.764918 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:48:22 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Dec 02 07:48:22 crc kubenswrapper[4691]: [+]process-running ok Dec 02 07:48:22 crc kubenswrapper[4691]: healthz check failed Dec 02 07:48:22 crc kubenswrapper[4691]: I1202 07:48:22.765081 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:48:22 crc kubenswrapper[4691]: I1202 07:48:22.886768 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.013428 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96-kube-api-access\") pod \"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96\" (UID: \"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96\") " Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.013718 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96-kubelet-dir\") pod \"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96\" (UID: \"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96\") " Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.014133 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96" (UID: "d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.026110 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96" (UID: "d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.115234 4691 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.115291 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.303707 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96","Type":"ContainerDied","Data":"ac6a48ed1809e9ddc8decfe1cbe47815d7e790d15710b4303e3d1a67d49364e3"} Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.303773 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac6a48ed1809e9ddc8decfe1cbe47815d7e790d15710b4303e3d1a67d49364e3" Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.303842 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.323181 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd","Type":"ContainerStarted","Data":"1464ebcb70df807b40f991c8951907ea79c0da8b92f5abaf82e927cf799b8ac2"} Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.350818 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.350788406 podStartE2EDuration="3.350788406s" podCreationTimestamp="2025-12-02 07:48:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:23.345967393 +0000 UTC m=+151.130046255" watchObservedRunningTime="2025-12-02 07:48:23.350788406 +0000 UTC m=+151.134867268" Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.764450 4691 patch_prober.go:28] interesting pod/router-default-5444994796-h7dkw container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 02 07:48:23 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Dec 02 07:48:23 crc kubenswrapper[4691]: [+]process-running ok Dec 02 07:48:23 crc kubenswrapper[4691]: healthz check failed Dec 02 07:48:23 crc kubenswrapper[4691]: I1202 07:48:23.764718 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-h7dkw" podUID="46a270d2-43de-41d0-bb1b-dc02b1a28d3a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 07:48:24 crc kubenswrapper[4691]: I1202 07:48:24.354896 4691 generic.go:334] "Generic (PLEG): container finished" podID="0c7cbf63-1ec7-408d-ba91-8b94b4a811fd" containerID="1464ebcb70df807b40f991c8951907ea79c0da8b92f5abaf82e927cf799b8ac2" exitCode=0 Dec 02 07:48:24 crc kubenswrapper[4691]: I1202 07:48:24.354940 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd","Type":"ContainerDied","Data":"1464ebcb70df807b40f991c8951907ea79c0da8b92f5abaf82e927cf799b8ac2"} Dec 02 07:48:24 crc kubenswrapper[4691]: I1202 07:48:24.762170 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:24 crc kubenswrapper[4691]: I1202 07:48:24.773336 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-h7dkw" Dec 02 07:48:26 crc kubenswrapper[4691]: I1202 07:48:26.304199 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-jsqnn" Dec 02 07:48:26 crc kubenswrapper[4691]: I1202 07:48:26.942826 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:26 crc kubenswrapper[4691]: I1202 07:48:26.946447 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:48:32 crc kubenswrapper[4691]: I1202 07:48:32.154680 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:48:32 crc kubenswrapper[4691]: I1202 07:48:32.171512 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b30f2d1f-53a1-4e87-819d-1e20bf3ed92a-metrics-certs\") pod \"network-metrics-daemon-8lqps\" (UID: \"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a\") " pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:48:32 crc kubenswrapper[4691]: I1202 07:48:32.196988 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8lqps" Dec 02 07:48:34 crc kubenswrapper[4691]: I1202 07:48:34.588875 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:48:34 crc kubenswrapper[4691]: I1202 07:48:34.701337 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c7cbf63-1ec7-408d-ba91-8b94b4a811fd-kube-api-access\") pod \"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd\" (UID: \"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd\") " Dec 02 07:48:34 crc kubenswrapper[4691]: I1202 07:48:34.701409 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c7cbf63-1ec7-408d-ba91-8b94b4a811fd-kubelet-dir\") pod \"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd\" (UID: \"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd\") " Dec 02 07:48:34 crc kubenswrapper[4691]: I1202 07:48:34.701642 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c7cbf63-1ec7-408d-ba91-8b94b4a811fd-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0c7cbf63-1ec7-408d-ba91-8b94b4a811fd" (UID: "0c7cbf63-1ec7-408d-ba91-8b94b4a811fd"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:48:34 crc kubenswrapper[4691]: I1202 07:48:34.701934 4691 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0c7cbf63-1ec7-408d-ba91-8b94b4a811fd-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:34 crc kubenswrapper[4691]: I1202 07:48:34.710566 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c7cbf63-1ec7-408d-ba91-8b94b4a811fd-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0c7cbf63-1ec7-408d-ba91-8b94b4a811fd" (UID: "0c7cbf63-1ec7-408d-ba91-8b94b4a811fd"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:34 crc kubenswrapper[4691]: I1202 07:48:34.803039 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0c7cbf63-1ec7-408d-ba91-8b94b4a811fd-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:35 crc kubenswrapper[4691]: I1202 07:48:35.102421 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" Dec 02 07:48:35 crc kubenswrapper[4691]: I1202 07:48:35.482413 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0c7cbf63-1ec7-408d-ba91-8b94b4a811fd","Type":"ContainerDied","Data":"e5025d8f34b7f1249c77c8e3f9f0596fdf46899df8eacc37300d184a6c50e4bc"} Dec 02 07:48:35 crc kubenswrapper[4691]: I1202 07:48:35.482447 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5025d8f34b7f1249c77c8e3f9f0596fdf46899df8eacc37300d184a6c50e4bc" Dec 02 07:48:35 crc kubenswrapper[4691]: I1202 07:48:35.482534 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 02 07:48:47 crc kubenswrapper[4691]: I1202 07:48:47.056158 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-5qs4n" Dec 02 07:48:48 crc kubenswrapper[4691]: E1202 07:48:48.753373 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 02 07:48:48 crc kubenswrapper[4691]: E1202 07:48:48.753620 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gvkvq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-54mc7_openshift-marketplace(ce744080-92af-4598-9f69-23dae48f82b5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 07:48:48 crc kubenswrapper[4691]: E1202 07:48:48.755194 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-54mc7" podUID="ce744080-92af-4598-9f69-23dae48f82b5" Dec 02 07:48:49 crc kubenswrapper[4691]: E1202 07:48:49.626161 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-54mc7" podUID="ce744080-92af-4598-9f69-23dae48f82b5" Dec 02 07:48:49 crc kubenswrapper[4691]: E1202 07:48:49.687721 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 02 07:48:49 crc kubenswrapper[4691]: E1202 07:48:49.688199 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4s6xq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-5sc9f_openshift-marketplace(81d25a09-ad32-4de2-860e-250010e610cb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 07:48:49 crc kubenswrapper[4691]: E1202 07:48:49.689366 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-5sc9f" podUID="81d25a09-ad32-4de2-860e-250010e610cb" Dec 02 07:48:49 crc kubenswrapper[4691]: E1202 07:48:49.715732 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Dec 02 07:48:49 crc kubenswrapper[4691]: E1202 07:48:49.715935 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8j7dq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-2cxnn_openshift-marketplace(d2fd2d03-d859-4c45-b7fa-b9af4816e1cf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 07:48:49 crc kubenswrapper[4691]: E1202 07:48:49.717115 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-2cxnn" podUID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" Dec 02 07:48:51 crc kubenswrapper[4691]: I1202 07:48:51.899213 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:48:51 crc kubenswrapper[4691]: I1202 07:48:51.899586 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:48:52 crc kubenswrapper[4691]: E1202 07:48:52.767459 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-2cxnn" podUID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" Dec 02 07:48:52 crc kubenswrapper[4691]: E1202 07:48:52.768115 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-5sc9f" podUID="81d25a09-ad32-4de2-860e-250010e610cb" Dec 02 07:48:54 crc kubenswrapper[4691]: E1202 
07:48:54.657961 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 02 07:48:54 crc kubenswrapper[4691]: E1202 07:48:54.658678 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zrkmq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-tmd9f_openshift-marketplace(aa15a612-ba38-4138-86e2-9840652f1724): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 07:48:54 crc kubenswrapper[4691]: E1202 07:48:54.662327 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-tmd9f" podUID="aa15a612-ba38-4138-86e2-9840652f1724" Dec 02 07:48:54 crc kubenswrapper[4691]: E1202 07:48:54.706943 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 02 07:48:54 crc kubenswrapper[4691]: E1202 07:48:54.707121 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-54h5k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-54xgq_openshift-marketplace(b6c2ff9e-15cc-4fab-8039-b51552b052c0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 07:48:54 crc kubenswrapper[4691]: E1202 07:48:54.708519 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-54xgq" podUID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" Dec 02 07:48:54 crc kubenswrapper[4691]: E1202 07:48:54.726472 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Dec 02 07:48:54 crc kubenswrapper[4691]: E1202 07:48:54.726605 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vdgjb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bjm9m_openshift-marketplace(1ce6e92a-6368-4f1a-8926-88a28ff76460): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 02 07:48:54 crc kubenswrapper[4691]: E1202 07:48:54.727727 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bjm9m" podUID="1ce6e92a-6368-4f1a-8926-88a28ff76460" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.847085 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 07:48:54 crc kubenswrapper[4691]: E1202 07:48:54.860075 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c7cbf63-1ec7-408d-ba91-8b94b4a811fd" containerName="pruner" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.860127 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c7cbf63-1ec7-408d-ba91-8b94b4a811fd" containerName="pruner" Dec 02 07:48:54 crc kubenswrapper[4691]: E1202 07:48:54.860177 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96" containerName="pruner" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.860189 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96" containerName="pruner" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.860407 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c7cbf63-1ec7-408d-ba91-8b94b4a811fd" containerName="pruner" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.860426 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0750d56-8f83-4dbc-8c7b-8e3f0f78ca96" containerName="pruner" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.861004 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.861126 4691 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.868225 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.868595 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.893506 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180794db-347f-4628-9e7d-c7e80dc8378f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"180794db-347f-4628-9e7d-c7e80dc8378f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.893572 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180794db-347f-4628-9e7d-c7e80dc8378f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"180794db-347f-4628-9e7d-c7e80dc8378f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.994329 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180794db-347f-4628-9e7d-c7e80dc8378f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"180794db-347f-4628-9e7d-c7e80dc8378f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.994442 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180794db-347f-4628-9e7d-c7e80dc8378f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"180794db-347f-4628-9e7d-c7e80dc8378f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:48:54 crc kubenswrapper[4691]: I1202 07:48:54.994508 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180794db-347f-4628-9e7d-c7e80dc8378f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"180794db-347f-4628-9e7d-c7e80dc8378f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:48:55 crc kubenswrapper[4691]: I1202 07:48:55.030753 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180794db-347f-4628-9e7d-c7e80dc8378f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"180794db-347f-4628-9e7d-c7e80dc8378f\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:48:55 crc kubenswrapper[4691]: I1202 07:48:55.074339 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8lqps"] Dec 02 07:48:55 crc kubenswrapper[4691]: W1202 07:48:55.080056 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb30f2d1f_53a1_4e87_819d_1e20bf3ed92a.slice/crio-1be43f6229f30cb5fad097f7377d65c6ff3726541475541f43b1b0266c261300 WatchSource:0}: Error finding container 1be43f6229f30cb5fad097f7377d65c6ff3726541475541f43b1b0266c261300: Status 404 returned error can't find the container with id 1be43f6229f30cb5fad097f7377d65c6ff3726541475541f43b1b0266c261300 Dec 02 07:48:55 crc 
kubenswrapper[4691]: I1202 07:48:55.191792 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:48:55 crc kubenswrapper[4691]: I1202 07:48:55.620741 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd4nh" event={"ID":"103dafaf-ac5c-49e6-86c1-d604072e0ccc","Type":"ContainerStarted","Data":"79d8f7137927f5f05485d7ce686cec2aa4afb6810e9bacdf99269a956e85e317"} Dec 02 07:48:55 crc kubenswrapper[4691]: I1202 07:48:55.622865 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmzz7" event={"ID":"47426780-5ffb-47da-8a00-fb96b6a6099a","Type":"ContainerStarted","Data":"63dcdbdccdb852cfc1c7fbd73e99467f1061ed49f91ad3e795004125458da9c6"} Dec 02 07:48:55 crc kubenswrapper[4691]: I1202 07:48:55.624536 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8lqps" event={"ID":"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a","Type":"ContainerStarted","Data":"af2ac564eb76dab8d45789c9fc9a7e1e38f83810e2ab2ea9c82fe06d1c1cbe99"} Dec 02 07:48:55 crc kubenswrapper[4691]: I1202 07:48:55.624578 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8lqps" event={"ID":"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a","Type":"ContainerStarted","Data":"1be43f6229f30cb5fad097f7377d65c6ff3726541475541f43b1b0266c261300"} Dec 02 07:48:55 crc kubenswrapper[4691]: E1202 07:48:55.625978 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-tmd9f" podUID="aa15a612-ba38-4138-86e2-9840652f1724" Dec 02 07:48:55 crc kubenswrapper[4691]: E1202 07:48:55.626165 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bjm9m" podUID="1ce6e92a-6368-4f1a-8926-88a28ff76460" Dec 02 07:48:55 crc kubenswrapper[4691]: E1202 07:48:55.626220 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-54xgq" podUID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" Dec 02 07:48:55 crc kubenswrapper[4691]: I1202 07:48:55.690466 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 02 07:48:55 crc kubenswrapper[4691]: W1202 07:48:55.703295 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod180794db_347f_4628_9e7d_c7e80dc8378f.slice/crio-764a0eff34b9139ffd2cd66b04bac8c2ebf4d9317b64e8aced47f3e1ca7cd41a WatchSource:0}: Error finding container 764a0eff34b9139ffd2cd66b04bac8c2ebf4d9317b64e8aced47f3e1ca7cd41a: Status 404 returned error can't find the container with id 764a0eff34b9139ffd2cd66b04bac8c2ebf4d9317b64e8aced47f3e1ca7cd41a Dec 02 07:48:56 crc kubenswrapper[4691]: I1202 07:48:56.632856 4691 generic.go:334] "Generic (PLEG): container finished" podID="47426780-5ffb-47da-8a00-fb96b6a6099a" 
containerID="63dcdbdccdb852cfc1c7fbd73e99467f1061ed49f91ad3e795004125458da9c6" exitCode=0 Dec 02 07:48:56 crc kubenswrapper[4691]: I1202 07:48:56.632938 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmzz7" event={"ID":"47426780-5ffb-47da-8a00-fb96b6a6099a","Type":"ContainerDied","Data":"63dcdbdccdb852cfc1c7fbd73e99467f1061ed49f91ad3e795004125458da9c6"} Dec 02 07:48:56 crc kubenswrapper[4691]: I1202 07:48:56.638862 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8lqps" event={"ID":"b30f2d1f-53a1-4e87-819d-1e20bf3ed92a","Type":"ContainerStarted","Data":"248919b5d0f65a806d4f437fa29c90680497bd621b1cf5c936d83be5ddbfe8d1"} Dec 02 07:48:56 crc kubenswrapper[4691]: I1202 07:48:56.640931 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"180794db-347f-4628-9e7d-c7e80dc8378f","Type":"ContainerStarted","Data":"c6fcdb9358a56b8fef1d84897811b41767a485185d1ad9e205715363d0af7224"} Dec 02 07:48:56 crc kubenswrapper[4691]: I1202 07:48:56.640968 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"180794db-347f-4628-9e7d-c7e80dc8378f","Type":"ContainerStarted","Data":"764a0eff34b9139ffd2cd66b04bac8c2ebf4d9317b64e8aced47f3e1ca7cd41a"} Dec 02 07:48:56 crc kubenswrapper[4691]: I1202 07:48:56.643037 4691 generic.go:334] "Generic (PLEG): container finished" podID="103dafaf-ac5c-49e6-86c1-d604072e0ccc" containerID="79d8f7137927f5f05485d7ce686cec2aa4afb6810e9bacdf99269a956e85e317" exitCode=0 Dec 02 07:48:56 crc kubenswrapper[4691]: I1202 07:48:56.643083 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd4nh" event={"ID":"103dafaf-ac5c-49e6-86c1-d604072e0ccc","Type":"ContainerDied","Data":"79d8f7137927f5f05485d7ce686cec2aa4afb6810e9bacdf99269a956e85e317"} Dec 02 07:48:56 crc kubenswrapper[4691]: I1202 07:48:56.664009 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.663991128 podStartE2EDuration="2.663991128s" podCreationTimestamp="2025-12-02 07:48:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:56.662099149 +0000 UTC m=+184.446178021" watchObservedRunningTime="2025-12-02 07:48:56.663991128 +0000 UTC m=+184.448070000" Dec 02 07:48:56 crc kubenswrapper[4691]: I1202 07:48:56.678115 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8lqps" podStartSLOduration=166.678092399 podStartE2EDuration="2m46.678092399s" podCreationTimestamp="2025-12-02 07:46:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:48:56.675272456 +0000 UTC m=+184.459351318" watchObservedRunningTime="2025-12-02 07:48:56.678092399 +0000 UTC m=+184.462171261" Dec 02 07:48:57 crc kubenswrapper[4691]: I1202 07:48:57.651227 4691 generic.go:334] "Generic (PLEG): container finished" podID="180794db-347f-4628-9e7d-c7e80dc8378f" containerID="c6fcdb9358a56b8fef1d84897811b41767a485185d1ad9e205715363d0af7224" exitCode=0 Dec 02 07:48:57 crc kubenswrapper[4691]: I1202 07:48:57.651322 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"180794db-347f-4628-9e7d-c7e80dc8378f","Type":"ContainerDied","Data":"c6fcdb9358a56b8fef1d84897811b41767a485185d1ad9e205715363d0af7224"} Dec 02 07:48:57 crc kubenswrapper[4691]: I1202 07:48:57.655002 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd4nh" event={"ID":"103dafaf-ac5c-49e6-86c1-d604072e0ccc","Type":"ContainerStarted","Data":"2497f3f39c08e65fec13173cdbac570332d9aac6a98172d4c80d35a820e52bed"} Dec 02 07:48:57 crc kubenswrapper[4691]: I1202 07:48:57.657320 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmzz7" event={"ID":"47426780-5ffb-47da-8a00-fb96b6a6099a","Type":"ContainerStarted","Data":"d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f"} Dec 02 07:48:57 crc kubenswrapper[4691]: I1202 07:48:57.696380 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jmzz7" podStartSLOduration=2.511790103 podStartE2EDuration="42.696360058s" podCreationTimestamp="2025-12-02 07:48:15 +0000 UTC" firstStartedPulling="2025-12-02 07:48:16.963363492 +0000 UTC m=+144.747442354" lastFinishedPulling="2025-12-02 07:48:57.147933447 +0000 UTC m=+184.932012309" observedRunningTime="2025-12-02 07:48:57.692299224 +0000 UTC m=+185.476378086" watchObservedRunningTime="2025-12-02 07:48:57.696360058 +0000 UTC m=+185.480438910" Dec 02 07:48:58 crc kubenswrapper[4691]: I1202 07:48:58.696661 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 02 07:48:58 crc kubenswrapper[4691]: I1202 07:48:58.714954 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nd4nh" podStartSLOduration=3.32143651 podStartE2EDuration="43.714933334s" podCreationTimestamp="2025-12-02 07:48:15 +0000 UTC" firstStartedPulling="2025-12-02 07:48:16.974344743 +0000 UTC m=+144.758423605" lastFinishedPulling="2025-12-02 07:48:57.367841567 +0000 UTC m=+185.151920429" observedRunningTime="2025-12-02 07:48:57.72185009 +0000 UTC m=+185.505928952" watchObservedRunningTime="2025-12-02 07:48:58.714933334 +0000 UTC m=+186.499012196" Dec 02 07:48:58 crc kubenswrapper[4691]: I1202 07:48:58.999374 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:48:59 crc kubenswrapper[4691]: I1202 07:48:59.155482 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180794db-347f-4628-9e7d-c7e80dc8378f-kube-api-access\") pod \"180794db-347f-4628-9e7d-c7e80dc8378f\" (UID: \"180794db-347f-4628-9e7d-c7e80dc8378f\") " Dec 02 07:48:59 crc kubenswrapper[4691]: I1202 07:48:59.155563 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180794db-347f-4628-9e7d-c7e80dc8378f-kubelet-dir\") pod \"180794db-347f-4628-9e7d-c7e80dc8378f\" (UID: \"180794db-347f-4628-9e7d-c7e80dc8378f\") " Dec 02 07:48:59 crc kubenswrapper[4691]: I1202 07:48:59.155613 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/180794db-347f-4628-9e7d-c7e80dc8378f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "180794db-347f-4628-9e7d-c7e80dc8378f" (UID: "180794db-347f-4628-9e7d-c7e80dc8378f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:48:59 crc kubenswrapper[4691]: I1202 07:48:59.155852 4691 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/180794db-347f-4628-9e7d-c7e80dc8378f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:59 crc kubenswrapper[4691]: I1202 07:48:59.162851 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/180794db-347f-4628-9e7d-c7e80dc8378f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "180794db-347f-4628-9e7d-c7e80dc8378f" (UID: "180794db-347f-4628-9e7d-c7e80dc8378f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:48:59 crc kubenswrapper[4691]: I1202 07:48:59.256571 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/180794db-347f-4628-9e7d-c7e80dc8378f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:48:59 crc kubenswrapper[4691]: I1202 07:48:59.669524 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"180794db-347f-4628-9e7d-c7e80dc8378f","Type":"ContainerDied","Data":"764a0eff34b9139ffd2cd66b04bac8c2ebf4d9317b64e8aced47f3e1ca7cd41a"} Dec 02 07:48:59 crc kubenswrapper[4691]: I1202 07:48:59.669563 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="764a0eff34b9139ffd2cd66b04bac8c2ebf4d9317b64e8aced47f3e1ca7cd41a" Dec 02 07:48:59 crc kubenswrapper[4691]: I1202 07:48:59.669602 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 02 07:49:00 crc kubenswrapper[4691]: I1202 07:49:00.839600 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 07:49:00 crc kubenswrapper[4691]: E1202 07:49:00.840132 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="180794db-347f-4628-9e7d-c7e80dc8378f" containerName="pruner" Dec 02 07:49:00 crc kubenswrapper[4691]: I1202 07:49:00.840145 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="180794db-347f-4628-9e7d-c7e80dc8378f" containerName="pruner" Dec 02 07:49:00 crc kubenswrapper[4691]: I1202 07:49:00.840272 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="180794db-347f-4628-9e7d-c7e80dc8378f" containerName="pruner" Dec 02 07:49:00 crc kubenswrapper[4691]: I1202 07:49:00.840806 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:00 crc kubenswrapper[4691]: I1202 07:49:00.843611 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 02 07:49:00 crc kubenswrapper[4691]: I1202 07:49:00.843651 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 02 07:49:00 crc kubenswrapper[4691]: I1202 07:49:00.851549 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 07:49:00 crc kubenswrapper[4691]: I1202 07:49:00.979363 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-var-lock\") pod \"installer-9-crc\" (UID: \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:00 crc kubenswrapper[4691]: I1202 07:49:00.979419 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:00 crc kubenswrapper[4691]: I1202 07:49:00.979439 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-kube-api-access\") pod \"installer-9-crc\" (UID: \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:01 crc kubenswrapper[4691]: I1202 07:49:01.080417 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-var-lock\") pod \"installer-9-crc\" (UID: \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:01 crc kubenswrapper[4691]: I1202 07:49:01.080470 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:01 crc kubenswrapper[4691]: I1202 07:49:01.080491 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-kube-api-access\") pod \"installer-9-crc\" (UID: \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:01 crc kubenswrapper[4691]: I1202 07:49:01.080533 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-var-lock\") pod \"installer-9-crc\" (UID: \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:01 crc kubenswrapper[4691]: I1202 07:49:01.080594 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:01 crc kubenswrapper[4691]: I1202 07:49:01.102418 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-kube-api-access\") pod \"installer-9-crc\" (UID: \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:01 crc kubenswrapper[4691]: I1202 07:49:01.157106 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:01 crc kubenswrapper[4691]: I1202 07:49:01.600461 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 02 07:49:01 crc kubenswrapper[4691]: W1202 07:49:01.602029 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8c0f9d79_0ecc_4cba_86e4_8587b32f45b4.slice/crio-e1b66c0e1e53c4c5217583b40272a6ff74d2e408f92ed3f3ca1e7466f5884b77 WatchSource:0}: Error finding container e1b66c0e1e53c4c5217583b40272a6ff74d2e408f92ed3f3ca1e7466f5884b77: Status 404 returned error can't find the container with id e1b66c0e1e53c4c5217583b40272a6ff74d2e408f92ed3f3ca1e7466f5884b77 Dec 02 07:49:01 crc kubenswrapper[4691]: I1202 07:49:01.686681 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4","Type":"ContainerStarted","Data":"e1b66c0e1e53c4c5217583b40272a6ff74d2e408f92ed3f3ca1e7466f5884b77"} Dec 02 07:49:03 crc kubenswrapper[4691]: I1202 07:49:03.699559 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54mc7" event={"ID":"ce744080-92af-4598-9f69-23dae48f82b5","Type":"ContainerStarted","Data":"c895941dfd03588d43a5f19ebab831f331171e16cc5496f9834d7afb4ddd6bc1"} Dec 02 07:49:03 crc kubenswrapper[4691]: I1202 07:49:03.701088 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4","Type":"ContainerStarted","Data":"ce6128990e4aa202ed232a33b0aa079c05efb69e4654f388cacc4c80f319cb0c"} Dec 02 07:49:03 crc kubenswrapper[4691]: I1202 07:49:03.746318 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.746280821 podStartE2EDuration="3.746280821s" podCreationTimestamp="2025-12-02 07:49:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:49:03.743876039 +0000 UTC m=+191.527954901" watchObservedRunningTime="2025-12-02 07:49:03.746280821 +0000 UTC m=+191.530359683" Dec 02 07:49:04 crc kubenswrapper[4691]: I1202 07:49:04.708730 4691 generic.go:334] "Generic (PLEG): container finished" podID="ce744080-92af-4598-9f69-23dae48f82b5" containerID="c895941dfd03588d43a5f19ebab831f331171e16cc5496f9834d7afb4ddd6bc1" exitCode=0 Dec 02 07:49:04 crc kubenswrapper[4691]: I1202 07:49:04.708816 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54mc7" event={"ID":"ce744080-92af-4598-9f69-23dae48f82b5","Type":"ContainerDied","Data":"c895941dfd03588d43a5f19ebab831f331171e16cc5496f9834d7afb4ddd6bc1"} Dec 02 07:49:05 crc kubenswrapper[4691]: I1202 07:49:05.588241 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-mcdgg"] Dec 02 07:49:05 crc kubenswrapper[4691]: I1202 07:49:05.613869 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:49:05 crc kubenswrapper[4691]: I1202 07:49:05.613919 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:49:05 crc kubenswrapper[4691]: I1202 07:49:05.853128 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:49:05 crc kubenswrapper[4691]: I1202 07:49:05.899887 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:49:05 crc kubenswrapper[4691]: I1202 07:49:05.936593 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:49:05 crc kubenswrapper[4691]: I1202 07:49:05.936689 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:49:05 crc kubenswrapper[4691]: I1202 07:49:05.989369 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:49:06 crc kubenswrapper[4691]: I1202 07:49:06.723213 4691 generic.go:334] "Generic (PLEG): container finished" podID="81d25a09-ad32-4de2-860e-250010e610cb" containerID="561cc6033e5a2d94805428e20a58fbfdf58f14cffdb52442dc892cdf6df987ae" exitCode=0 Dec 02 07:49:06 crc kubenswrapper[4691]: I1202 07:49:06.723296 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sc9f" event={"ID":"81d25a09-ad32-4de2-860e-250010e610cb","Type":"ContainerDied","Data":"561cc6033e5a2d94805428e20a58fbfdf58f14cffdb52442dc892cdf6df987ae"} Dec 02 07:49:06 crc kubenswrapper[4691]: I1202 07:49:06.773124 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:49:09 crc kubenswrapper[4691]: I1202 07:49:09.156740 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nd4nh"] Dec 02 07:49:09 crc kubenswrapper[4691]: I1202 07:49:09.157267 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nd4nh" podUID="103dafaf-ac5c-49e6-86c1-d604072e0ccc" containerName="registry-server" containerID="cri-o://2497f3f39c08e65fec13173cdbac570332d9aac6a98172d4c80d35a820e52bed" gracePeriod=2 Dec 02 07:49:11 crc kubenswrapper[4691]: I1202 07:49:11.764084 4691 generic.go:334] "Generic (PLEG): container finished" podID="103dafaf-ac5c-49e6-86c1-d604072e0ccc" containerID="2497f3f39c08e65fec13173cdbac570332d9aac6a98172d4c80d35a820e52bed" exitCode=0 Dec 02 07:49:11 crc kubenswrapper[4691]: I1202 07:49:11.764281 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd4nh" event={"ID":"103dafaf-ac5c-49e6-86c1-d604072e0ccc","Type":"ContainerDied","Data":"2497f3f39c08e65fec13173cdbac570332d9aac6a98172d4c80d35a820e52bed"} Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.637967 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.737931 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfj7f\" (UniqueName: \"kubernetes.io/projected/103dafaf-ac5c-49e6-86c1-d604072e0ccc-kube-api-access-xfj7f\") pod \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\" (UID: \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\") " Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.738017 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103dafaf-ac5c-49e6-86c1-d604072e0ccc-catalog-content\") pod \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\" (UID: \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\") " Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.738040 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103dafaf-ac5c-49e6-86c1-d604072e0ccc-utilities\") pod \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\" (UID: \"103dafaf-ac5c-49e6-86c1-d604072e0ccc\") " Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.738976 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/103dafaf-ac5c-49e6-86c1-d604072e0ccc-utilities" (OuterVolumeSpecName: "utilities") pod "103dafaf-ac5c-49e6-86c1-d604072e0ccc" (UID: "103dafaf-ac5c-49e6-86c1-d604072e0ccc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.743573 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/103dafaf-ac5c-49e6-86c1-d604072e0ccc-kube-api-access-xfj7f" (OuterVolumeSpecName: "kube-api-access-xfj7f") pod "103dafaf-ac5c-49e6-86c1-d604072e0ccc" (UID: "103dafaf-ac5c-49e6-86c1-d604072e0ccc"). InnerVolumeSpecName "kube-api-access-xfj7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.780696 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nd4nh" event={"ID":"103dafaf-ac5c-49e6-86c1-d604072e0ccc","Type":"ContainerDied","Data":"3ded147b90fe056dca8a47d094d9f353ff0779f60febcbbd028ee8207a1ffcbc"} Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.780806 4691 scope.go:117] "RemoveContainer" containerID="2497f3f39c08e65fec13173cdbac570332d9aac6a98172d4c80d35a820e52bed" Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.780965 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nd4nh" Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.840374 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfj7f\" (UniqueName: \"kubernetes.io/projected/103dafaf-ac5c-49e6-86c1-d604072e0ccc-kube-api-access-xfj7f\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.840452 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/103dafaf-ac5c-49e6-86c1-d604072e0ccc-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.883553 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/103dafaf-ac5c-49e6-86c1-d604072e0ccc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "103dafaf-ac5c-49e6-86c1-d604072e0ccc" (UID: "103dafaf-ac5c-49e6-86c1-d604072e0ccc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:12 crc kubenswrapper[4691]: I1202 07:49:12.940891 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/103dafaf-ac5c-49e6-86c1-d604072e0ccc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:13 crc kubenswrapper[4691]: I1202 07:49:13.117938 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nd4nh"] Dec 02 07:49:13 crc kubenswrapper[4691]: I1202 07:49:13.121828 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nd4nh"] Dec 02 07:49:14 crc kubenswrapper[4691]: I1202 07:49:14.567396 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="103dafaf-ac5c-49e6-86c1-d604072e0ccc" path="/var/lib/kubelet/pods/103dafaf-ac5c-49e6-86c1-d604072e0ccc/volumes" Dec 02 07:49:15 crc kubenswrapper[4691]: I1202 07:49:15.315838 4691 scope.go:117] "RemoveContainer" containerID="79d8f7137927f5f05485d7ce686cec2aa4afb6810e9bacdf99269a956e85e317" Dec 02 07:49:16 crc kubenswrapper[4691]: I1202 07:49:16.973567 4691 scope.go:117] "RemoveContainer" containerID="2a8a0f8e5fb6b43795134c109078a61538b3008a29a32e2a032579828277d6e4" Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 07:49:19.829290 4691 generic.go:334] "Generic (PLEG): container finished" podID="1ce6e92a-6368-4f1a-8926-88a28ff76460" containerID="446a5af40f2f6758da98de4b45d4dd6834a2cc39783fede09a08f8ebfce4e334" exitCode=0 Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 07:49:19.829358 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjm9m" event={"ID":"1ce6e92a-6368-4f1a-8926-88a28ff76460","Type":"ContainerDied","Data":"446a5af40f2f6758da98de4b45d4dd6834a2cc39783fede09a08f8ebfce4e334"} Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 07:49:19.833936 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sc9f" event={"ID":"81d25a09-ad32-4de2-860e-250010e610cb","Type":"ContainerStarted","Data":"be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4"} Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 07:49:19.836714 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54mc7" event={"ID":"ce744080-92af-4598-9f69-23dae48f82b5","Type":"ContainerStarted","Data":"27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be"} Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 
Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 07:49:19.840818 4691 generic.go:334] "Generic (PLEG): container finished" podID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" containerID="eff7158962c9da587891c3fee426c28e8cf1f3fda5ef6440e20df303a804c0be" exitCode=0
Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 07:49:19.840913 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54xgq" event={"ID":"b6c2ff9e-15cc-4fab-8039-b51552b052c0","Type":"ContainerDied","Data":"eff7158962c9da587891c3fee426c28e8cf1f3fda5ef6440e20df303a804c0be"}
Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 07:49:19.852842 4691 generic.go:334] "Generic (PLEG): container finished" podID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" containerID="0ca4511ea7e2c1e7c61e4dbe5ab01f9148f85e65fc38dd944cc92b8263705461" exitCode=0
Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 07:49:19.852956 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2cxnn" event={"ID":"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf","Type":"ContainerDied","Data":"0ca4511ea7e2c1e7c61e4dbe5ab01f9148f85e65fc38dd944cc92b8263705461"}
Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 07:49:19.855231 4691 generic.go:334] "Generic (PLEG): container finished" podID="aa15a612-ba38-4138-86e2-9840652f1724" containerID="9b56851a2e808ecb9b8a7ff1d7dc467b867cf38e59d4a807fc9de3544e0ec3ce" exitCode=0
Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 07:49:19.855268 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmd9f" event={"ID":"aa15a612-ba38-4138-86e2-9840652f1724","Type":"ContainerDied","Data":"9b56851a2e808ecb9b8a7ff1d7dc467b867cf38e59d4a807fc9de3544e0ec3ce"}
Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 07:49:19.896424 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5sc9f" podStartSLOduration=5.718088348 podStartE2EDuration="1m6.896405342s" podCreationTimestamp="2025-12-02 07:48:13 +0000 UTC" firstStartedPulling="2025-12-02 07:48:15.894343744 +0000 UTC m=+143.678422606" lastFinishedPulling="2025-12-02 07:49:17.072660738 +0000 UTC m=+204.856739600" observedRunningTime="2025-12-02 07:49:19.874545571 +0000 UTC m=+207.658624433" watchObservedRunningTime="2025-12-02 07:49:19.896405342 +0000 UTC m=+207.680484204"
Dec 02 07:49:19 crc kubenswrapper[4691]: I1202 07:49:19.897592 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-54mc7" podStartSLOduration=5.7863062549999995 podStartE2EDuration="1m7.897585622s" podCreationTimestamp="2025-12-02 07:48:12 +0000 UTC" firstStartedPulling="2025-12-02 07:48:14.862462067 +0000 UTC m=+142.646540929" lastFinishedPulling="2025-12-02 07:49:16.973741434 +0000 UTC m=+204.757820296" observedRunningTime="2025-12-02 07:49:19.896226317 +0000 UTC m=+207.680305179" watchObservedRunningTime="2025-12-02 07:49:19.897585622 +0000 UTC m=+207.681664484"
Dec 02 07:49:20 crc kubenswrapper[4691]: I1202 07:49:20.863473 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54xgq" event={"ID":"b6c2ff9e-15cc-4fab-8039-b51552b052c0","Type":"ContainerStarted","Data":"b41f6ceaef9271f604b7f561fc117b6454f17dab08e167afab61723fbe8b5f04"}
Dec 02 07:49:20 crc kubenswrapper[4691]: I1202 07:49:20.865600 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2cxnn" event={"ID":"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf","Type":"ContainerStarted","Data":"d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448"}
Dec 02 07:49:20 crc kubenswrapper[4691]: I1202 07:49:20.867595 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmd9f" event={"ID":"aa15a612-ba38-4138-86e2-9840652f1724","Type":"ContainerStarted","Data":"4e31bd7777daba754726a0895b9400efea6f08c6f04ec80bff46df388381684f"}
Dec 02 07:49:20 crc kubenswrapper[4691]: I1202 07:49:20.869401 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjm9m" event={"ID":"1ce6e92a-6368-4f1a-8926-88a28ff76460","Type":"ContainerStarted","Data":"61d00c4917ba25f4ef4fe71a4fee0a76c11ae1f8cccadb0690416a6179841e5e"}
Dec 02 07:49:20 crc kubenswrapper[4691]: I1202 07:49:20.904747 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-54xgq" podStartSLOduration=4.379906345 podStartE2EDuration="1m9.904727785s" podCreationTimestamp="2025-12-02 07:48:11 +0000 UTC" firstStartedPulling="2025-12-02 07:48:14.865237428 +0000 UTC m=+142.649316290" lastFinishedPulling="2025-12-02 07:49:20.390058868 +0000 UTC m=+208.174137730" observedRunningTime="2025-12-02 07:49:20.883899169 +0000 UTC m=+208.667978051" watchObservedRunningTime="2025-12-02 07:49:20.904727785 +0000 UTC m=+208.688806647"
Dec 02 07:49:20 crc kubenswrapper[4691]: I1202 07:49:20.904905 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tmd9f" podStartSLOduration=3.395948472 podStartE2EDuration="1m8.904900139s" podCreationTimestamp="2025-12-02 07:48:12 +0000 UTC" firstStartedPulling="2025-12-02 07:48:14.871238412 +0000 UTC m=+142.655317274" lastFinishedPulling="2025-12-02 07:49:20.380190079 +0000 UTC m=+208.164268941" observedRunningTime="2025-12-02 07:49:20.90098717 +0000 UTC m=+208.685066052" watchObservedRunningTime="2025-12-02 07:49:20.904900139 +0000 UTC m=+208.688979011"
Dec 02 07:49:20 crc kubenswrapper[4691]: I1202 07:49:20.927340 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2cxnn" podStartSLOduration=2.405179091 podStartE2EDuration="1m6.927314744s" podCreationTimestamp="2025-12-02 07:48:14 +0000 UTC" firstStartedPulling="2025-12-02 07:48:15.88755336 +0000 UTC m=+143.671632222" lastFinishedPulling="2025-12-02 07:49:20.409689013 +0000 UTC m=+208.193767875" observedRunningTime="2025-12-02 07:49:20.922920003 +0000 UTC m=+208.706998865" watchObservedRunningTime="2025-12-02 07:49:20.927314744 +0000 UTC m=+208.711393606"
Dec 02 07:49:20 crc kubenswrapper[4691]: I1202 07:49:20.941452 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bjm9m" podStartSLOduration=4.519847337 podStartE2EDuration="1m9.94143041s" podCreationTimestamp="2025-12-02 07:48:11 +0000 UTC" firstStartedPulling="2025-12-02 07:48:14.87507904 +0000 UTC m=+142.659157902" lastFinishedPulling="2025-12-02 07:49:20.296662113 +0000 UTC m=+208.080740975" observedRunningTime="2025-12-02 07:49:20.939338917 +0000 UTC m=+208.723417809" watchObservedRunningTime="2025-12-02 07:49:20.94143041 +0000 UTC m=+208.725509272"
Dec 02 07:49:21 crc kubenswrapper[4691]: I1202 07:49:21.898566 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 07:49:21 crc kubenswrapper[4691]: I1202 07:49:21.898636 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 07:49:21 crc kubenswrapper[4691]: I1202 07:49:21.898688 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6"
Dec 02 07:49:21 crc kubenswrapper[4691]: I1202 07:49:21.899387 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 07:49:21 crc kubenswrapper[4691]: I1202 07:49:21.899502 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873" gracePeriod=600
Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.382696 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.385161 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.445526 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.448247 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-54xgq"
Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.449651 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-54xgq"
Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.608420 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tmd9f"
Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.608652 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tmd9f"
Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.653476 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tmd9f"
Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.860392 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-54mc7"
Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.860452 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-54mc7"
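[Editor's note: the machine-config-daemon sequence above shows the full liveness-failure path in order: patch_prober records the connection refusal on 127.0.0.1:8798, prober.go marks the probe failed, the sync loop flags the pod unhealthy, and kuberuntime kills the container with a 600s grace period so it can be restarted. A minimal sketch for tallying such failures across a journal dump like this one:]

```python
import re
from collections import Counter

# Tally failed probes per (pod, probe type) from prober.go entries such as the
# machine-config-daemon liveness failure above. Illustrative sketch only.
PROBE = re.compile(r'"Probe failed" probeType="([^"]+)" pod="([^"]+)"')

def failed_probes(journal_text: str) -> Counter:
    return Counter((pod, ptype) for ptype, pod in PROBE.findall(journal_text))

# Usage:
# for (pod, ptype), n in failed_probes(text).most_common():
#     print(f"{n:4d}  {ptype:9s}  {pod}")
```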
(PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873" exitCode=0 Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.881230 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873"} Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.881269 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"bb78041367ae6920b31cd251da35411d957791f1be4b05d33750afac0123755e"} Dec 02 07:49:22 crc kubenswrapper[4691]: I1202 07:49:22.911645 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:49:23 crc kubenswrapper[4691]: I1202 07:49:23.494663 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-54xgq" podUID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" containerName="registry-server" probeResult="failure" output=< Dec 02 07:49:23 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Dec 02 07:49:23 crc kubenswrapper[4691]: > Dec 02 07:49:24 crc kubenswrapper[4691]: I1202 07:49:24.294428 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5sc9f" Dec 02 07:49:24 crc kubenswrapper[4691]: I1202 07:49:24.294808 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5sc9f" Dec 02 07:49:24 crc kubenswrapper[4691]: I1202 07:49:24.339115 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5sc9f" Dec 02 07:49:24 crc kubenswrapper[4691]: I1202 07:49:24.550182 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:49:24 crc kubenswrapper[4691]: I1202 07:49:24.550572 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:49:24 crc kubenswrapper[4691]: I1202 07:49:24.590365 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:49:24 crc kubenswrapper[4691]: I1202 07:49:24.932889 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5sc9f" Dec 02 07:49:25 crc kubenswrapper[4691]: I1202 07:49:25.951079 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.157133 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2cxnn"] Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.157481 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2cxnn" podUID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" containerName="registry-server" containerID="cri-o://d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448" gracePeriod=2 Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 
07:49:28.650183 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.778458 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8j7dq\" (UniqueName: \"kubernetes.io/projected/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-kube-api-access-8j7dq\") pod \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\" (UID: \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\") " Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.778556 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-catalog-content\") pod \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\" (UID: \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\") " Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.778620 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-utilities\") pod \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\" (UID: \"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf\") " Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.779516 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-utilities" (OuterVolumeSpecName: "utilities") pod "d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" (UID: "d2fd2d03-d859-4c45-b7fa-b9af4816e1cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.791810 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-kube-api-access-8j7dq" (OuterVolumeSpecName: "kube-api-access-8j7dq") pod "d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" (UID: "d2fd2d03-d859-4c45-b7fa-b9af4816e1cf"). InnerVolumeSpecName "kube-api-access-8j7dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.806094 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" (UID: "d2fd2d03-d859-4c45-b7fa-b9af4816e1cf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.879814 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8j7dq\" (UniqueName: \"kubernetes.io/projected/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-kube-api-access-8j7dq\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.879858 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.879867 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.920438 4691 generic.go:334] "Generic (PLEG): container finished" podID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" containerID="d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448" exitCode=0 Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.920490 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2cxnn" event={"ID":"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf","Type":"ContainerDied","Data":"d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448"} Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.920523 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2cxnn" event={"ID":"d2fd2d03-d859-4c45-b7fa-b9af4816e1cf","Type":"ContainerDied","Data":"f7c58e2eab913fd7c50cc10e5d44057449acd99dfec65be6a1311659857093b0"} Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.920535 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2cxnn" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.920544 4691 scope.go:117] "RemoveContainer" containerID="d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.935793 4691 scope.go:117] "RemoveContainer" containerID="0ca4511ea7e2c1e7c61e4dbe5ab01f9148f85e65fc38dd944cc92b8263705461" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.946401 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2cxnn"] Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.950701 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2cxnn"] Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.952297 4691 scope.go:117] "RemoveContainer" containerID="e135096d86c87d23e4ab7c113d105b649730b46c9b9ee67dc2c7f28c53ee458e" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.970992 4691 scope.go:117] "RemoveContainer" containerID="d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448" Dec 02 07:49:28 crc kubenswrapper[4691]: E1202 07:49:28.971482 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448\": container with ID starting with d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448 not found: ID does not exist" containerID="d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.971527 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448"} err="failed to get container status \"d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448\": rpc error: code = NotFound desc = could not find container \"d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448\": container with ID starting with d5516f9cf1fd22d82d2f71871a813c5b5eb39fc1dbba6d2386d23499f9dca448 not found: ID does not exist" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.971563 4691 scope.go:117] "RemoveContainer" containerID="0ca4511ea7e2c1e7c61e4dbe5ab01f9148f85e65fc38dd944cc92b8263705461" Dec 02 07:49:28 crc kubenswrapper[4691]: E1202 07:49:28.971858 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ca4511ea7e2c1e7c61e4dbe5ab01f9148f85e65fc38dd944cc92b8263705461\": container with ID starting with 0ca4511ea7e2c1e7c61e4dbe5ab01f9148f85e65fc38dd944cc92b8263705461 not found: ID does not exist" containerID="0ca4511ea7e2c1e7c61e4dbe5ab01f9148f85e65fc38dd944cc92b8263705461" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.971907 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ca4511ea7e2c1e7c61e4dbe5ab01f9148f85e65fc38dd944cc92b8263705461"} err="failed to get container status \"0ca4511ea7e2c1e7c61e4dbe5ab01f9148f85e65fc38dd944cc92b8263705461\": rpc error: code = NotFound desc = could not find container \"0ca4511ea7e2c1e7c61e4dbe5ab01f9148f85e65fc38dd944cc92b8263705461\": container with ID starting with 0ca4511ea7e2c1e7c61e4dbe5ab01f9148f85e65fc38dd944cc92b8263705461 not found: ID does not exist" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.971939 4691 scope.go:117] "RemoveContainer" 
containerID="e135096d86c87d23e4ab7c113d105b649730b46c9b9ee67dc2c7f28c53ee458e" Dec 02 07:49:28 crc kubenswrapper[4691]: E1202 07:49:28.972371 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e135096d86c87d23e4ab7c113d105b649730b46c9b9ee67dc2c7f28c53ee458e\": container with ID starting with e135096d86c87d23e4ab7c113d105b649730b46c9b9ee67dc2c7f28c53ee458e not found: ID does not exist" containerID="e135096d86c87d23e4ab7c113d105b649730b46c9b9ee67dc2c7f28c53ee458e" Dec 02 07:49:28 crc kubenswrapper[4691]: I1202 07:49:28.972431 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e135096d86c87d23e4ab7c113d105b649730b46c9b9ee67dc2c7f28c53ee458e"} err="failed to get container status \"e135096d86c87d23e4ab7c113d105b649730b46c9b9ee67dc2c7f28c53ee458e\": rpc error: code = NotFound desc = could not find container \"e135096d86c87d23e4ab7c113d105b649730b46c9b9ee67dc2c7f28c53ee458e\": container with ID starting with e135096d86c87d23e4ab7c113d105b649730b46c9b9ee67dc2c7f28c53ee458e not found: ID does not exist" Dec 02 07:49:30 crc kubenswrapper[4691]: I1202 07:49:30.572162 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" path="/var/lib/kubelet/pods/d2fd2d03-d859-4c45-b7fa-b9af4816e1cf/volumes" Dec 02 07:49:30 crc kubenswrapper[4691]: I1202 07:49:30.615421 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" podUID="1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" containerName="oauth-openshift" containerID="cri-o://1b3c8e6ffa91fe202799d5dff384bfae5c144da309ca88387bd099eddb4fe097" gracePeriod=15 Dec 02 07:49:31 crc kubenswrapper[4691]: I1202 07:49:31.941611 4691 generic.go:334] "Generic (PLEG): container finished" podID="1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" containerID="1b3c8e6ffa91fe202799d5dff384bfae5c144da309ca88387bd099eddb4fe097" exitCode=0 Dec 02 07:49:31 crc kubenswrapper[4691]: I1202 07:49:31.941658 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" event={"ID":"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9","Type":"ContainerDied","Data":"1b3c8e6ffa91fe202799d5dff384bfae5c144da309ca88387bd099eddb4fe097"} Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.111126 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216210 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-service-ca\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216261 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-provider-selection\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216296 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-audit-policies\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216326 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-login\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216348 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-error\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216392 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-audit-dir\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216415 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-idp-0-file-data\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216467 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-serving-cert\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216504 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-ocp-branding-template\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc 
kubenswrapper[4691]: I1202 07:49:32.216614 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-session\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216664 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-trusted-ca-bundle\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216687 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kd9dk\" (UniqueName: \"kubernetes.io/projected/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-kube-api-access-kd9dk\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216729 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-cliconfig\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.216794 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-router-certs\") pod \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\" (UID: \"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9\") " Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.217325 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.218257 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.218314 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.218412 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.218684 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.222381 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.222698 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-kube-api-access-kd9dk" (OuterVolumeSpecName: "kube-api-access-kd9dk") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "kube-api-access-kd9dk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.222817 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.223067 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.223494 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.223776 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.226960 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.228100 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.232664 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" (UID: "1a9d5a6e-8110-4099-9906-eacf4a9e7ac9"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318726 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318801 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318812 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318823 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318835 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kd9dk\" (UniqueName: \"kubernetes.io/projected/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-kube-api-access-kd9dk\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318843 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318853 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318861 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318871 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318883 4691 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318893 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318903 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318912 4691 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.318922 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.421183 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bjm9m" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.506836 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-54xgq" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.549475 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-54xgq" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.645093 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tmd9f" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.899548 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.947752 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" event={"ID":"1a9d5a6e-8110-4099-9906-eacf4a9e7ac9","Type":"ContainerDied","Data":"bc08c90ea1ca6d240504ac103cb6d917c914414f2b8dcaf6d17426d6a73554f1"} Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.947877 4691 scope.go:117] "RemoveContainer" containerID="1b3c8e6ffa91fe202799d5dff384bfae5c144da309ca88387bd099eddb4fe097" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.947780 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-mcdgg" Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.964691 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mcdgg"] Dec 02 07:49:32 crc kubenswrapper[4691]: I1202 07:49:32.967372 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-mcdgg"] Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.371132 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b"] Dec 02 07:49:33 crc kubenswrapper[4691]: E1202 07:49:33.371361 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" containerName="registry-server" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.371377 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" containerName="registry-server" Dec 02 07:49:33 crc kubenswrapper[4691]: E1202 07:49:33.371390 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" containerName="oauth-openshift" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.371398 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" containerName="oauth-openshift" Dec 02 07:49:33 crc kubenswrapper[4691]: E1202 07:49:33.371413 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103dafaf-ac5c-49e6-86c1-d604072e0ccc" containerName="registry-server" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.371421 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="103dafaf-ac5c-49e6-86c1-d604072e0ccc" containerName="registry-server" Dec 02 07:49:33 crc kubenswrapper[4691]: E1202 07:49:33.371430 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" containerName="extract-utilities" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.371438 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" containerName="extract-utilities" Dec 02 07:49:33 crc kubenswrapper[4691]: E1202 07:49:33.371453 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" containerName="extract-content" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.371460 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" containerName="extract-content" Dec 02 07:49:33 crc kubenswrapper[4691]: E1202 07:49:33.371472 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103dafaf-ac5c-49e6-86c1-d604072e0ccc" containerName="extract-utilities" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.371478 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="103dafaf-ac5c-49e6-86c1-d604072e0ccc" containerName="extract-utilities" Dec 02 07:49:33 crc kubenswrapper[4691]: E1202 07:49:33.371489 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="103dafaf-ac5c-49e6-86c1-d604072e0ccc" containerName="extract-content" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.371496 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="103dafaf-ac5c-49e6-86c1-d604072e0ccc" containerName="extract-content" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.371612 4691 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" containerName="oauth-openshift" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.371633 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2fd2d03-d859-4c45-b7fa-b9af4816e1cf" containerName="registry-server" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.371644 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="103dafaf-ac5c-49e6-86c1-d604072e0ccc" containerName="registry-server" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.372151 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.373907 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.375220 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.375461 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.375678 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.375729 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.375875 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.375937 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.375992 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.377709 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.377904 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.378175 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.385397 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b"] Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.386149 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.386894 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.387673 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 07:49:33 crc 
kubenswrapper[4691]: I1202 07:49:33.395444 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.533159 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-audit-policies\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.533217 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-user-template-login\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.533261 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-user-template-error\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.533365 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.533423 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.533476 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.533586 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.533644 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.533677 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.533816 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-session\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.533909 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-audit-dir\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.533971 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.534026 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.534106 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6gfx\" (UniqueName: \"kubernetes.io/projected/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-kube-api-access-x6gfx\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.636401 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-audit-policies\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.635752 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-audit-policies\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.636501 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-user-template-login\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.636532 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-user-template-error\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.636960 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.636981 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.637004 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.637035 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.637055 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.637072 
4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.637092 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-session\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.637115 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.637130 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-audit-dir\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.637145 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.637166 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6gfx\" (UniqueName: \"kubernetes.io/projected/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-kube-api-access-x6gfx\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.637429 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-audit-dir\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.638171 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.638181 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.638483 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.640271 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-session\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.640337 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.640404 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.640663 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-user-template-error\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.641121 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.641110 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.642083 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-user-template-login\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.643295 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.653521 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6gfx\" (UniqueName: \"kubernetes.io/projected/fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6-kube-api-access-x6gfx\") pod \"oauth-openshift-5fff7d8cf9-jrg2b\" (UID: \"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6\") " pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:33 crc kubenswrapper[4691]: I1202 07:49:33.687652 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:34 crc kubenswrapper[4691]: I1202 07:49:34.121804 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b"] Dec 02 07:49:34 crc kubenswrapper[4691]: W1202 07:49:34.131226 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd11f354_d9f0_47f6_bf96_f5e42b0cd7b6.slice/crio-3b70ca74e450574ee45a34db2bf4230ddb49848770e56fd8b07994b63fc71c46 WatchSource:0}: Error finding container 3b70ca74e450574ee45a34db2bf4230ddb49848770e56fd8b07994b63fc71c46: Status 404 returned error can't find the container with id 3b70ca74e450574ee45a34db2bf4230ddb49848770e56fd8b07994b63fc71c46 Dec 02 07:49:34 crc kubenswrapper[4691]: I1202 07:49:34.568558 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a9d5a6e-8110-4099-9906-eacf4a9e7ac9" path="/var/lib/kubelet/pods/1a9d5a6e-8110-4099-9906-eacf4a9e7ac9/volumes" Dec 02 07:49:34 crc kubenswrapper[4691]: I1202 07:49:34.961113 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" event={"ID":"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6","Type":"ContainerStarted","Data":"60f72efb31673bfc85697dc3b6ae2d97ed0e30e5c85ae5a8543cb4c6b6c00b7d"} Dec 02 07:49:34 crc kubenswrapper[4691]: I1202 07:49:34.961154 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" event={"ID":"fd11f354-d9f0-47f6-bf96-f5e42b0cd7b6","Type":"ContainerStarted","Data":"3b70ca74e450574ee45a34db2bf4230ddb49848770e56fd8b07994b63fc71c46"} Dec 02 07:49:34 crc kubenswrapper[4691]: I1202 07:49:34.962219 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:34 crc kubenswrapper[4691]: I1202 07:49:34.985324 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" podStartSLOduration=29.985305744 podStartE2EDuration="29.985305744s" podCreationTimestamp="2025-12-02 07:49:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:49:34.979449966 +0000 UTC m=+222.763528828" watchObservedRunningTime="2025-12-02 07:49:34.985305744 +0000 UTC m=+222.769384606" Dec 02 07:49:35 crc kubenswrapper[4691]: I1202 07:49:35.137427 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5fff7d8cf9-jrg2b" Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.551680 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-54mc7"] Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.552519 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-54mc7" podUID="ce744080-92af-4598-9f69-23dae48f82b5" containerName="registry-server" containerID="cri-o://27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be" gracePeriod=2 Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.753774 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmd9f"] Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.754593 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tmd9f" podUID="aa15a612-ba38-4138-86e2-9840652f1724" containerName="registry-server" containerID="cri-o://4e31bd7777daba754726a0895b9400efea6f08c6f04ec80bff46df388381684f" gracePeriod=2 Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.907146 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.976092 4691 generic.go:334] "Generic (PLEG): container finished" podID="aa15a612-ba38-4138-86e2-9840652f1724" containerID="4e31bd7777daba754726a0895b9400efea6f08c6f04ec80bff46df388381684f" exitCode=0 Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.976174 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmd9f" event={"ID":"aa15a612-ba38-4138-86e2-9840652f1724","Type":"ContainerDied","Data":"4e31bd7777daba754726a0895b9400efea6f08c6f04ec80bff46df388381684f"} Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.978278 4691 generic.go:334] "Generic (PLEG): container finished" podID="ce744080-92af-4598-9f69-23dae48f82b5" containerID="27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be" exitCode=0 Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.978358 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-54mc7" Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.978366 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54mc7" event={"ID":"ce744080-92af-4598-9f69-23dae48f82b5","Type":"ContainerDied","Data":"27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be"} Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.978413 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-54mc7" event={"ID":"ce744080-92af-4598-9f69-23dae48f82b5","Type":"ContainerDied","Data":"35cc602df2a7fd8be7fa823ea869d832002a544a8fe5c4aef6df000080c1ebc1"} Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.978442 4691 scope.go:117] "RemoveContainer" containerID="27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be" Dec 02 07:49:36 crc kubenswrapper[4691]: I1202 07:49:36.992255 4691 scope.go:117] "RemoveContainer" containerID="c895941dfd03588d43a5f19ebab831f331171e16cc5496f9834d7afb4ddd6bc1" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.010979 4691 scope.go:117] "RemoveContainer" containerID="074d1adf4620936bf3d597fac456174136c093ee72e0d8887010b423607434c7" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.033931 4691 scope.go:117] "RemoveContainer" containerID="27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be" Dec 02 07:49:37 crc kubenswrapper[4691]: E1202 07:49:37.034295 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be\": container with ID starting with 27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be not found: ID does not exist" containerID="27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.034326 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be"} err="failed to get container status \"27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be\": rpc error: code = NotFound desc = could not find container \"27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be\": container with ID starting with 27c41b7e10b20f19e9a61283bd255aedc8ceb732fd44d65807531b90dbf748be not found: ID does not exist" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.034347 4691 scope.go:117] "RemoveContainer" containerID="c895941dfd03588d43a5f19ebab831f331171e16cc5496f9834d7afb4ddd6bc1" Dec 02 07:49:37 crc kubenswrapper[4691]: E1202 07:49:37.034691 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c895941dfd03588d43a5f19ebab831f331171e16cc5496f9834d7afb4ddd6bc1\": container with ID starting with c895941dfd03588d43a5f19ebab831f331171e16cc5496f9834d7afb4ddd6bc1 not found: ID does not exist" containerID="c895941dfd03588d43a5f19ebab831f331171e16cc5496f9834d7afb4ddd6bc1" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.034746 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c895941dfd03588d43a5f19ebab831f331171e16cc5496f9834d7afb4ddd6bc1"} err="failed to get container status \"c895941dfd03588d43a5f19ebab831f331171e16cc5496f9834d7afb4ddd6bc1\": rpc error: code = NotFound desc = could not find container 
\"c895941dfd03588d43a5f19ebab831f331171e16cc5496f9834d7afb4ddd6bc1\": container with ID starting with c895941dfd03588d43a5f19ebab831f331171e16cc5496f9834d7afb4ddd6bc1 not found: ID does not exist" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.034797 4691 scope.go:117] "RemoveContainer" containerID="074d1adf4620936bf3d597fac456174136c093ee72e0d8887010b423607434c7" Dec 02 07:49:37 crc kubenswrapper[4691]: E1202 07:49:37.035097 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"074d1adf4620936bf3d597fac456174136c093ee72e0d8887010b423607434c7\": container with ID starting with 074d1adf4620936bf3d597fac456174136c093ee72e0d8887010b423607434c7 not found: ID does not exist" containerID="074d1adf4620936bf3d597fac456174136c093ee72e0d8887010b423607434c7" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.035123 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"074d1adf4620936bf3d597fac456174136c093ee72e0d8887010b423607434c7"} err="failed to get container status \"074d1adf4620936bf3d597fac456174136c093ee72e0d8887010b423607434c7\": rpc error: code = NotFound desc = could not find container \"074d1adf4620936bf3d597fac456174136c093ee72e0d8887010b423607434c7\": container with ID starting with 074d1adf4620936bf3d597fac456174136c093ee72e0d8887010b423607434c7 not found: ID does not exist" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.082189 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce744080-92af-4598-9f69-23dae48f82b5-utilities\") pod \"ce744080-92af-4598-9f69-23dae48f82b5\" (UID: \"ce744080-92af-4598-9f69-23dae48f82b5\") " Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.082240 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvkvq\" (UniqueName: \"kubernetes.io/projected/ce744080-92af-4598-9f69-23dae48f82b5-kube-api-access-gvkvq\") pod \"ce744080-92af-4598-9f69-23dae48f82b5\" (UID: \"ce744080-92af-4598-9f69-23dae48f82b5\") " Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.082283 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce744080-92af-4598-9f69-23dae48f82b5-catalog-content\") pod \"ce744080-92af-4598-9f69-23dae48f82b5\" (UID: \"ce744080-92af-4598-9f69-23dae48f82b5\") " Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.083249 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce744080-92af-4598-9f69-23dae48f82b5-utilities" (OuterVolumeSpecName: "utilities") pod "ce744080-92af-4598-9f69-23dae48f82b5" (UID: "ce744080-92af-4598-9f69-23dae48f82b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.087920 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce744080-92af-4598-9f69-23dae48f82b5-kube-api-access-gvkvq" (OuterVolumeSpecName: "kube-api-access-gvkvq") pod "ce744080-92af-4598-9f69-23dae48f82b5" (UID: "ce744080-92af-4598-9f69-23dae48f82b5"). InnerVolumeSpecName "kube-api-access-gvkvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.097853 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmd9f" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.132864 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce744080-92af-4598-9f69-23dae48f82b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce744080-92af-4598-9f69-23dae48f82b5" (UID: "ce744080-92af-4598-9f69-23dae48f82b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.183799 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce744080-92af-4598-9f69-23dae48f82b5-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.183836 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvkvq\" (UniqueName: \"kubernetes.io/projected/ce744080-92af-4598-9f69-23dae48f82b5-kube-api-access-gvkvq\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.183847 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce744080-92af-4598-9f69-23dae48f82b5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.284579 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrkmq\" (UniqueName: \"kubernetes.io/projected/aa15a612-ba38-4138-86e2-9840652f1724-kube-api-access-zrkmq\") pod \"aa15a612-ba38-4138-86e2-9840652f1724\" (UID: \"aa15a612-ba38-4138-86e2-9840652f1724\") " Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.284664 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa15a612-ba38-4138-86e2-9840652f1724-utilities\") pod \"aa15a612-ba38-4138-86e2-9840652f1724\" (UID: \"aa15a612-ba38-4138-86e2-9840652f1724\") " Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.284792 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa15a612-ba38-4138-86e2-9840652f1724-catalog-content\") pod \"aa15a612-ba38-4138-86e2-9840652f1724\" (UID: \"aa15a612-ba38-4138-86e2-9840652f1724\") " Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.285988 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa15a612-ba38-4138-86e2-9840652f1724-utilities" (OuterVolumeSpecName: "utilities") pod "aa15a612-ba38-4138-86e2-9840652f1724" (UID: "aa15a612-ba38-4138-86e2-9840652f1724"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.289171 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa15a612-ba38-4138-86e2-9840652f1724-kube-api-access-zrkmq" (OuterVolumeSpecName: "kube-api-access-zrkmq") pod "aa15a612-ba38-4138-86e2-9840652f1724" (UID: "aa15a612-ba38-4138-86e2-9840652f1724"). InnerVolumeSpecName "kube-api-access-zrkmq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.315912 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-54mc7"] Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.318425 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-54mc7"] Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.349329 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa15a612-ba38-4138-86e2-9840652f1724-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aa15a612-ba38-4138-86e2-9840652f1724" (UID: "aa15a612-ba38-4138-86e2-9840652f1724"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.386274 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aa15a612-ba38-4138-86e2-9840652f1724-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.386327 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aa15a612-ba38-4138-86e2-9840652f1724-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.386345 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrkmq\" (UniqueName: \"kubernetes.io/projected/aa15a612-ba38-4138-86e2-9840652f1724-kube-api-access-zrkmq\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.987934 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmd9f" event={"ID":"aa15a612-ba38-4138-86e2-9840652f1724","Type":"ContainerDied","Data":"26822aeca534992ba11ddfda8209edb0b1a5a8b0c18c2bd257084ea49168d634"} Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.987979 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmd9f" Dec 02 07:49:37 crc kubenswrapper[4691]: I1202 07:49:37.988230 4691 scope.go:117] "RemoveContainer" containerID="4e31bd7777daba754726a0895b9400efea6f08c6f04ec80bff46df388381684f" Dec 02 07:49:38 crc kubenswrapper[4691]: I1202 07:49:38.016394 4691 scope.go:117] "RemoveContainer" containerID="9b56851a2e808ecb9b8a7ff1d7dc467b867cf38e59d4a807fc9de3544e0ec3ce" Dec 02 07:49:38 crc kubenswrapper[4691]: I1202 07:49:38.016889 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmd9f"] Dec 02 07:49:38 crc kubenswrapper[4691]: I1202 07:49:38.020790 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tmd9f"] Dec 02 07:49:38 crc kubenswrapper[4691]: I1202 07:49:38.040395 4691 scope.go:117] "RemoveContainer" containerID="646f02725a6d68510243ecb1c33e4bc7bf44904652f9b2142ab845f5dff89e14" Dec 02 07:49:38 crc kubenswrapper[4691]: I1202 07:49:38.568727 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa15a612-ba38-4138-86e2-9840652f1724" path="/var/lib/kubelet/pods/aa15a612-ba38-4138-86e2-9840652f1724/volumes" Dec 02 07:49:38 crc kubenswrapper[4691]: I1202 07:49:38.569379 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce744080-92af-4598-9f69-23dae48f82b5" path="/var/lib/kubelet/pods/ce744080-92af-4598-9f69-23dae48f82b5/volumes" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.509298 4691 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.510080 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa15a612-ba38-4138-86e2-9840652f1724" containerName="extract-utilities" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.510100 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa15a612-ba38-4138-86e2-9840652f1724" containerName="extract-utilities" Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.510120 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa15a612-ba38-4138-86e2-9840652f1724" containerName="extract-content" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.510126 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa15a612-ba38-4138-86e2-9840652f1724" containerName="extract-content" Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.510136 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce744080-92af-4598-9f69-23dae48f82b5" containerName="registry-server" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.510143 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce744080-92af-4598-9f69-23dae48f82b5" containerName="registry-server" Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.510149 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa15a612-ba38-4138-86e2-9840652f1724" containerName="registry-server" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.510156 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa15a612-ba38-4138-86e2-9840652f1724" containerName="registry-server" Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.510171 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce744080-92af-4598-9f69-23dae48f82b5" containerName="extract-content" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.510180 4691 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="ce744080-92af-4598-9f69-23dae48f82b5" containerName="extract-content" Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.510189 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce744080-92af-4598-9f69-23dae48f82b5" containerName="extract-utilities" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.510196 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce744080-92af-4598-9f69-23dae48f82b5" containerName="extract-utilities" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.510307 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce744080-92af-4598-9f69-23dae48f82b5" containerName="registry-server" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.510318 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa15a612-ba38-4138-86e2-9840652f1724" containerName="registry-server" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.522634 4691 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.522718 4691 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.522908 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.523595 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f" gracePeriod=15 Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.523635 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e" gracePeriod=15 Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.523646 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff" gracePeriod=15 Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.523675 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270" gracePeriod=15 Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.523615 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395" gracePeriod=15 Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.524570 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 07:49:40 crc 
kubenswrapper[4691]: I1202 07:49:40.524611 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.524629 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.524638 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.524657 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.524666 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.524686 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.524693 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.524701 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.524709 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.524721 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.527607 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.527684 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.527695 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.528026 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.528050 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.528065 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.528080 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 
07:49:40.528089 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.528123 4691 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.528588 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.570781 4691 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.640093 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.640141 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.640341 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.640375 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.640414 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.640439 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.640461 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.640480 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.741959 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742024 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742072 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742103 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742093 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742131 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742189 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742194 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742216 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742219 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742257 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742241 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742285 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742258 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742343 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.742351 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: I1202 07:49:40.872049 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:40 crc kubenswrapper[4691]: W1202 07:49:40.894428 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-cae3e7b2cc6beeef7d0c100e6192d99cbbe93ea628d44ef75e9ac67514063043 WatchSource:0}: Error finding container cae3e7b2cc6beeef7d0c100e6192d99cbbe93ea628d44ef75e9ac67514063043: Status 404 returned error can't find the container with id cae3e7b2cc6beeef7d0c100e6192d99cbbe93ea628d44ef75e9ac67514063043 Dec 02 07:49:40 crc kubenswrapper[4691]: E1202 07:49:40.899919 4691 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.222:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d568a471b758d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 07:49:40.897559949 +0000 UTC m=+228.681638811,LastTimestamp:2025-12-02 07:49:40.897559949 +0000 UTC m=+228.681638811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 07:49:41 crc kubenswrapper[4691]: I1202 07:49:41.010547 4691 generic.go:334] "Generic (PLEG): container finished" podID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" containerID="ce6128990e4aa202ed232a33b0aa079c05efb69e4654f388cacc4c80f319cb0c" exitCode=0 Dec 02 07:49:41 crc kubenswrapper[4691]: I1202 07:49:41.010653 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4","Type":"ContainerDied","Data":"ce6128990e4aa202ed232a33b0aa079c05efb69e4654f388cacc4c80f319cb0c"} Dec 02 07:49:41 crc kubenswrapper[4691]: I1202 07:49:41.011746 4691 status_manager.go:851] "Failed to get status for pod" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:41 crc kubenswrapper[4691]: I1202 07:49:41.012611 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"cae3e7b2cc6beeef7d0c100e6192d99cbbe93ea628d44ef75e9ac67514063043"} Dec 02 07:49:41 crc kubenswrapper[4691]: I1202 07:49:41.014748 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 02 07:49:41 crc kubenswrapper[4691]: I1202 07:49:41.016398 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 07:49:41 crc kubenswrapper[4691]: I1202 07:49:41.017279 4691 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff" exitCode=0 Dec 02 07:49:41 crc kubenswrapper[4691]: I1202 07:49:41.017303 4691 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e" exitCode=0 Dec 02 07:49:41 crc kubenswrapper[4691]: I1202 07:49:41.017313 4691 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395" exitCode=0 Dec 02 07:49:41 crc kubenswrapper[4691]: I1202 07:49:41.017323 4691 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270" exitCode=2 Dec 02 07:49:41 crc kubenswrapper[4691]: I1202 07:49:41.017387 4691 scope.go:117] "RemoveContainer" containerID="ea4b1f076d045db689bf75dc1dcf394ed1e22808ff93df6b84f503ddcd6962df" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.027081 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"c9feff3d2193c7dca294d5b259dba80e2bfe3ebf643bef0df682be7b425f0590"} Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.029320 4691 status_manager.go:851] "Failed to get status for pod" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:42 crc kubenswrapper[4691]: E1202 07:49:42.029495 4691 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.032219 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.299452 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.300685 4691 status_manager.go:851] "Failed to get status for pod" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.373688 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-kube-api-access\") pod \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\" (UID: \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\") " Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.373733 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-var-lock\") pod \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\" (UID: \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\") " Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.373801 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-kubelet-dir\") pod \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\" (UID: \"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4\") " Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.373964 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-var-lock" (OuterVolumeSpecName: "var-lock") pod "8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" (UID: "8c0f9d79-0ecc-4cba-86e4-8587b32f45b4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.374031 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" (UID: "8c0f9d79-0ecc-4cba-86e4-8587b32f45b4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.380814 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" (UID: "8c0f9d79-0ecc-4cba-86e4-8587b32f45b4"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.475399 4691 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.475433 4691 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.475443 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8c0f9d79-0ecc-4cba-86e4-8587b32f45b4-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.567522 4691 status_manager.go:851] "Failed to get status for pod" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:42 crc kubenswrapper[4691]: E1202 07:49:42.569843 4691 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" volumeName="registry-storage" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.885460 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.887083 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.887692 4691 status_manager.go:851] "Failed to get status for pod" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.888134 4691 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.982337 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.982402 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.982477 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.982462 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.982492 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.982510 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.982785 4691 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.982799 4691 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:42 crc kubenswrapper[4691]: I1202 07:49:42.982808 4691 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.045256 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8c0f9d79-0ecc-4cba-86e4-8587b32f45b4","Type":"ContainerDied","Data":"e1b66c0e1e53c4c5217583b40272a6ff74d2e408f92ed3f3ca1e7466f5884b77"} Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.045334 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.045344 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1b66c0e1e53c4c5217583b40272a6ff74d2e408f92ed3f3ca1e7466f5884b77" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.050085 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.051221 4691 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.051325 4691 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f" exitCode=0 Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.051426 4691 scope.go:117] "RemoveContainer" containerID="f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.051470 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.051627 4691 status_manager.go:851] "Failed to get status for pod" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:43 crc kubenswrapper[4691]: E1202 07:49:43.051807 4691 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.072002 4691 scope.go:117] "RemoveContainer" containerID="824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.076259 4691 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.077591 4691 status_manager.go:851] "Failed to get status for pod" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.093946 4691 scope.go:117] "RemoveContainer" containerID="5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.111035 4691 scope.go:117] "RemoveContainer" containerID="52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.128982 4691 scope.go:117] "RemoveContainer" containerID="33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.148529 4691 scope.go:117] "RemoveContainer" containerID="3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.172080 4691 scope.go:117] "RemoveContainer" containerID="f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff" Dec 02 07:49:43 crc kubenswrapper[4691]: E1202 07:49:43.172683 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\": container with ID starting with f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff not found: ID does not exist" containerID="f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.172733 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff"} err="failed to get container status \"f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\": rpc error: code = NotFound desc = could not find container 
\"f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff\": container with ID starting with f5bbb017763cb3be5b261e28c2c27e6e75df4f777d1e865fc2e92ed3453b9cff not found: ID does not exist" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.172795 4691 scope.go:117] "RemoveContainer" containerID="824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e" Dec 02 07:49:43 crc kubenswrapper[4691]: E1202 07:49:43.173322 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\": container with ID starting with 824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e not found: ID does not exist" containerID="824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.173381 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e"} err="failed to get container status \"824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\": rpc error: code = NotFound desc = could not find container \"824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e\": container with ID starting with 824dfeb50ecaafdaf7ddb7a0881164b392ad7521df51631fe50594ffbecd029e not found: ID does not exist" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.173426 4691 scope.go:117] "RemoveContainer" containerID="5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395" Dec 02 07:49:43 crc kubenswrapper[4691]: E1202 07:49:43.173798 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\": container with ID starting with 5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395 not found: ID does not exist" containerID="5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.173825 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395"} err="failed to get container status \"5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\": rpc error: code = NotFound desc = could not find container \"5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395\": container with ID starting with 5b4e5a6503010f3964fa93e26e97eb39e7ccc8cdb36ef7a5df7e82b27116c395 not found: ID does not exist" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.173841 4691 scope.go:117] "RemoveContainer" containerID="52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270" Dec 02 07:49:43 crc kubenswrapper[4691]: E1202 07:49:43.174106 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\": container with ID starting with 52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270 not found: ID does not exist" containerID="52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.174143 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270"} 
err="failed to get container status \"52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\": rpc error: code = NotFound desc = could not find container \"52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270\": container with ID starting with 52957e30cd91ff936493ca755cc890f4986250befac38c97535da515a17d1270 not found: ID does not exist" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.174157 4691 scope.go:117] "RemoveContainer" containerID="33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f" Dec 02 07:49:43 crc kubenswrapper[4691]: E1202 07:49:43.174418 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\": container with ID starting with 33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f not found: ID does not exist" containerID="33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.174435 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f"} err="failed to get container status \"33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\": rpc error: code = NotFound desc = could not find container \"33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f\": container with ID starting with 33b14291bdf3d3ae766bf3c3b47a248cad0261ccc3280ae58a236eca397b072f not found: ID does not exist" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.174449 4691 scope.go:117] "RemoveContainer" containerID="3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df" Dec 02 07:49:43 crc kubenswrapper[4691]: E1202 07:49:43.174697 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\": container with ID starting with 3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df not found: ID does not exist" containerID="3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df" Dec 02 07:49:43 crc kubenswrapper[4691]: I1202 07:49:43.174728 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df"} err="failed to get container status \"3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\": rpc error: code = NotFound desc = could not find container \"3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df\": container with ID starting with 3884ee056d757701f55cddab41b0ba9bebb99686574188d38b24cd30166ce9df not found: ID does not exist" Dec 02 07:49:44 crc kubenswrapper[4691]: I1202 07:49:44.569204 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 02 07:49:48 crc kubenswrapper[4691]: E1202 07:49:48.301521 4691 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:48 crc kubenswrapper[4691]: E1202 07:49:48.302034 4691 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:48 crc kubenswrapper[4691]: E1202 07:49:48.302278 4691 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:48 crc kubenswrapper[4691]: E1202 07:49:48.302658 4691 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:48 crc kubenswrapper[4691]: E1202 07:49:48.303134 4691 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:48 crc kubenswrapper[4691]: I1202 07:49:48.303159 4691 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 02 07:49:48 crc kubenswrapper[4691]: E1202 07:49:48.303379 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="200ms" Dec 02 07:49:48 crc kubenswrapper[4691]: E1202 07:49:48.504159 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="400ms" Dec 02 07:49:48 crc kubenswrapper[4691]: E1202 07:49:48.904931 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="800ms" Dec 02 07:49:49 crc kubenswrapper[4691]: E1202 07:49:49.706659 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="1.6s" Dec 02 07:49:49 crc kubenswrapper[4691]: E1202 07:49:49.783299 4691 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.222:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187d568a471b758d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-02 07:49:40.897559949 +0000 
UTC m=+228.681638811,LastTimestamp:2025-12-02 07:49:40.897559949 +0000 UTC m=+228.681638811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 02 07:49:51 crc kubenswrapper[4691]: E1202 07:49:51.307300 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="3.2s" Dec 02 07:49:52 crc kubenswrapper[4691]: I1202 07:49:52.566264 4691 status_manager.go:851] "Failed to get status for pod" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:53 crc kubenswrapper[4691]: I1202 07:49:53.107221 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 07:49:53 crc kubenswrapper[4691]: I1202 07:49:53.107280 4691 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab" exitCode=1 Dec 02 07:49:53 crc kubenswrapper[4691]: I1202 07:49:53.107314 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab"} Dec 02 07:49:53 crc kubenswrapper[4691]: I1202 07:49:53.107989 4691 scope.go:117] "RemoveContainer" containerID="fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab" Dec 02 07:49:53 crc kubenswrapper[4691]: I1202 07:49:53.108221 4691 status_manager.go:851] "Failed to get status for pod" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:53 crc kubenswrapper[4691]: I1202 07:49:53.108629 4691 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:54 crc kubenswrapper[4691]: I1202 07:49:54.117609 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 07:49:54 crc kubenswrapper[4691]: I1202 07:49:54.118156 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5acbe2626928a9484a1270837fafe8abbd4a44a12ff7f53c93f5608f90b5b665"} Dec 02 07:49:54 crc kubenswrapper[4691]: I1202 07:49:54.119687 4691 status_manager.go:851] "Failed to get status for pod" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:54 crc kubenswrapper[4691]: I1202 07:49:54.120729 4691 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:54 crc kubenswrapper[4691]: E1202 07:49:54.508968 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.222:6443: connect: connection refused" interval="6.4s" Dec 02 07:49:54 crc kubenswrapper[4691]: I1202 07:49:54.561933 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:54 crc kubenswrapper[4691]: I1202 07:49:54.563053 4691 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:54 crc kubenswrapper[4691]: I1202 07:49:54.563950 4691 status_manager.go:851] "Failed to get status for pod" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:54 crc kubenswrapper[4691]: I1202 07:49:54.590277 4691 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bfac83b6-bc8a-4d98-994c-4f982f3e0b4d" Dec 02 07:49:54 crc kubenswrapper[4691]: I1202 07:49:54.590314 4691 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bfac83b6-bc8a-4d98-994c-4f982f3e0b4d" Dec 02 07:49:54 crc kubenswrapper[4691]: E1202 07:49:54.591078 4691 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:54 crc kubenswrapper[4691]: I1202 07:49:54.592206 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:55 crc kubenswrapper[4691]: I1202 07:49:55.128841 4691 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="b3a551004530f05ef1185585706a5d58f942958514a4aaceb46f1bd0ee42e248" exitCode=0 Dec 02 07:49:55 crc kubenswrapper[4691]: I1202 07:49:55.129097 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"b3a551004530f05ef1185585706a5d58f942958514a4aaceb46f1bd0ee42e248"} Dec 02 07:49:55 crc kubenswrapper[4691]: I1202 07:49:55.129262 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"791c3631756a44d1f7265bad8f84e071405216b15a5abf20d96d0176fdb717cd"} Dec 02 07:49:55 crc kubenswrapper[4691]: I1202 07:49:55.129523 4691 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bfac83b6-bc8a-4d98-994c-4f982f3e0b4d" Dec 02 07:49:55 crc kubenswrapper[4691]: I1202 07:49:55.129536 4691 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bfac83b6-bc8a-4d98-994c-4f982f3e0b4d" Dec 02 07:49:55 crc kubenswrapper[4691]: E1202 07:49:55.130362 4691 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:55 crc kubenswrapper[4691]: I1202 07:49:55.130461 4691 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:55 crc kubenswrapper[4691]: I1202 07:49:55.131026 4691 status_manager.go:851] "Failed to get status for pod" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.222:6443: connect: connection refused" Dec 02 07:49:56 crc kubenswrapper[4691]: I1202 07:49:56.137690 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b06ce850b6b92774a14367f737fc0bf6a196fd43e566a4a061672299a37628a5"} Dec 02 07:49:56 crc kubenswrapper[4691]: I1202 07:49:56.138180 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9cbee4b3fe000a49f6d78be5d59535947506ef5aa1b3b5ece3b112adce4dfb43"} Dec 02 07:49:56 crc kubenswrapper[4691]: I1202 07:49:56.138196 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7eff65700d53f70aae52c883bce1823748bd05551a8ebb34eb06cee374d78754"} Dec 02 07:49:56 crc kubenswrapper[4691]: I1202 07:49:56.138208 4691 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d52e548b5c91e56f3742f862e493ae1a064ddd6edfdef697245d7c1cf72ddc00"} Dec 02 07:49:57 crc kubenswrapper[4691]: I1202 07:49:57.149304 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bb8db32e02ac5ca5d65fcfcbd8886b44c684ea5c68285ce67ea50e2ed9387839"} Dec 02 07:49:57 crc kubenswrapper[4691]: I1202 07:49:57.149673 4691 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bfac83b6-bc8a-4d98-994c-4f982f3e0b4d" Dec 02 07:49:57 crc kubenswrapper[4691]: I1202 07:49:57.149703 4691 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bfac83b6-bc8a-4d98-994c-4f982f3e0b4d" Dec 02 07:49:57 crc kubenswrapper[4691]: I1202 07:49:57.149689 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:58 crc kubenswrapper[4691]: I1202 07:49:58.353229 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:49:59 crc kubenswrapper[4691]: I1202 07:49:59.445496 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:49:59 crc kubenswrapper[4691]: I1202 07:49:59.445857 4691 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 02 07:49:59 crc kubenswrapper[4691]: I1202 07:49:59.445935 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 02 07:49:59 crc kubenswrapper[4691]: I1202 07:49:59.592824 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:59 crc kubenswrapper[4691]: I1202 07:49:59.592879 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:49:59 crc kubenswrapper[4691]: I1202 07:49:59.597927 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:50:02 crc kubenswrapper[4691]: I1202 07:50:02.170094 4691 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:50:02 crc kubenswrapper[4691]: I1202 07:50:02.198550 4691 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bfac83b6-bc8a-4d98-994c-4f982f3e0b4d" Dec 02 07:50:02 crc kubenswrapper[4691]: I1202 07:50:02.198581 4691 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bfac83b6-bc8a-4d98-994c-4f982f3e0b4d" Dec 02 07:50:02 crc kubenswrapper[4691]: I1202 07:50:02.204174 4691 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:50:02 crc kubenswrapper[4691]: I1202 07:50:02.591048 4691 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ed9022c5-0395-4a1d-87c2-dcbce53bc53f" Dec 02 07:50:03 crc kubenswrapper[4691]: I1202 07:50:03.202605 4691 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bfac83b6-bc8a-4d98-994c-4f982f3e0b4d" Dec 02 07:50:03 crc kubenswrapper[4691]: I1202 07:50:03.202633 4691 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bfac83b6-bc8a-4d98-994c-4f982f3e0b4d" Dec 02 07:50:03 crc kubenswrapper[4691]: I1202 07:50:03.206284 4691 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ed9022c5-0395-4a1d-87c2-dcbce53bc53f" Dec 02 07:50:09 crc kubenswrapper[4691]: I1202 07:50:09.446036 4691 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 02 07:50:09 crc kubenswrapper[4691]: I1202 07:50:09.446361 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 02 07:50:11 crc kubenswrapper[4691]: I1202 07:50:11.571102 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 02 07:50:11 crc kubenswrapper[4691]: I1202 07:50:11.958722 4691 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 02 07:50:11 crc kubenswrapper[4691]: I1202 07:50:11.963960 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 07:50:11 crc kubenswrapper[4691]: I1202 07:50:11.964017 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 02 07:50:11 crc kubenswrapper[4691]: I1202 07:50:11.969092 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 02 07:50:12 crc kubenswrapper[4691]: I1202 07:50:12.015023 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=10.015005074 podStartE2EDuration="10.015005074s" podCreationTimestamp="2025-12-02 07:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:50:12.006856758 +0000 UTC m=+259.790935620" watchObservedRunningTime="2025-12-02 07:50:12.015005074 +0000 UTC m=+259.799083936" Dec 02 07:50:12 crc kubenswrapper[4691]: I1202 07:50:12.555187 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 02 
07:50:12 crc kubenswrapper[4691]: I1202 07:50:12.700407 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 02 07:50:12 crc kubenswrapper[4691]: I1202 07:50:12.955687 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 02 07:50:12 crc kubenswrapper[4691]: I1202 07:50:12.999361 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 02 07:50:13 crc kubenswrapper[4691]: I1202 07:50:13.218223 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 02 07:50:13 crc kubenswrapper[4691]: I1202 07:50:13.493575 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 07:50:13 crc kubenswrapper[4691]: I1202 07:50:13.540257 4691 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 02 07:50:13 crc kubenswrapper[4691]: I1202 07:50:13.540489 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://c9feff3d2193c7dca294d5b259dba80e2bfe3ebf643bef0df682be7b425f0590" gracePeriod=5 Dec 02 07:50:13 crc kubenswrapper[4691]: I1202 07:50:13.927012 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 02 07:50:13 crc kubenswrapper[4691]: I1202 07:50:13.939243 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 02 07:50:14 crc kubenswrapper[4691]: I1202 07:50:14.008320 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 02 07:50:14 crc kubenswrapper[4691]: I1202 07:50:14.193352 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 02 07:50:14 crc kubenswrapper[4691]: I1202 07:50:14.352368 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 02 07:50:14 crc kubenswrapper[4691]: I1202 07:50:14.378286 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 02 07:50:14 crc kubenswrapper[4691]: I1202 07:50:14.587595 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 02 07:50:14 crc kubenswrapper[4691]: I1202 07:50:14.612311 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 02 07:50:14 crc kubenswrapper[4691]: I1202 07:50:14.653165 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 02 07:50:14 crc kubenswrapper[4691]: I1202 07:50:14.844848 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 02 07:50:14 crc kubenswrapper[4691]: I1202 07:50:14.967877 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 02 07:50:15 crc kubenswrapper[4691]: I1202 
07:50:15.132602 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 02 07:50:15 crc kubenswrapper[4691]: I1202 07:50:15.217024 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 02 07:50:15 crc kubenswrapper[4691]: I1202 07:50:15.241610 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 02 07:50:15 crc kubenswrapper[4691]: I1202 07:50:15.366657 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 02 07:50:15 crc kubenswrapper[4691]: I1202 07:50:15.383734 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 02 07:50:15 crc kubenswrapper[4691]: I1202 07:50:15.397034 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 02 07:50:15 crc kubenswrapper[4691]: I1202 07:50:15.400833 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 02 07:50:15 crc kubenswrapper[4691]: I1202 07:50:15.840924 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 02 07:50:15 crc kubenswrapper[4691]: I1202 07:50:15.866003 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 02 07:50:15 crc kubenswrapper[4691]: I1202 07:50:15.965094 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.002485 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.105034 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.180664 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.186436 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.222064 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.242359 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.250309 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.264541 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.386746 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.444295 4691 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.487203 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.508994 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.542244 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.689630 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.719732 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.818496 4691 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.823307 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.882048 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.888933 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.943697 4691 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.952734 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 02 07:50:16 crc kubenswrapper[4691]: I1202 07:50:16.986071 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.057078 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.074703 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.257458 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.311477 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.338805 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.403648 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.467230 4691 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.467721 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.519108 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.568400 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.572708 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.670134 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.746259 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.923278 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 02 07:50:17 crc kubenswrapper[4691]: I1202 07:50:17.939959 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.065928 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.143205 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.150689 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.192195 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.193539 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.198380 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.255806 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.270055 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.295395 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.383999 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.453945 4691 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.516950 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.550997 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.588251 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.706031 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.778484 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.963797 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 02 07:50:18 crc kubenswrapper[4691]: I1202 07:50:18.977605 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.003320 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.110803 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.123493 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.123561 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.140393 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.194738 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.197914 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.202687 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.262962 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.285887 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.289847 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.289906 4691 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="c9feff3d2193c7dca294d5b259dba80e2bfe3ebf643bef0df682be7b425f0590" exitCode=137 Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.289949 4691 scope.go:117] "RemoveContainer" containerID="c9feff3d2193c7dca294d5b259dba80e2bfe3ebf643bef0df682be7b425f0590" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.290029 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.301418 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304225 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304262 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304300 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304327 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304350 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304427 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304434 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304420 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304470 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). 
InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304746 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304900 4691 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304923 4691 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304935 4691 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.304947 4691 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.307520 4691 scope.go:117] "RemoveContainer" containerID="c9feff3d2193c7dca294d5b259dba80e2bfe3ebf643bef0df682be7b425f0590" Dec 02 07:50:19 crc kubenswrapper[4691]: E1202 07:50:19.308317 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9feff3d2193c7dca294d5b259dba80e2bfe3ebf643bef0df682be7b425f0590\": container with ID starting with c9feff3d2193c7dca294d5b259dba80e2bfe3ebf643bef0df682be7b425f0590 not found: ID does not exist" containerID="c9feff3d2193c7dca294d5b259dba80e2bfe3ebf643bef0df682be7b425f0590" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.308379 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9feff3d2193c7dca294d5b259dba80e2bfe3ebf643bef0df682be7b425f0590"} err="failed to get container status \"c9feff3d2193c7dca294d5b259dba80e2bfe3ebf643bef0df682be7b425f0590\": rpc error: code = NotFound desc = could not find container \"c9feff3d2193c7dca294d5b259dba80e2bfe3ebf643bef0df682be7b425f0590\": container with ID starting with c9feff3d2193c7dca294d5b259dba80e2bfe3ebf643bef0df682be7b425f0590 not found: ID does not exist" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.313007 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.394407 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.403127 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.406174 4691 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.422879 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.446650 4691 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.446721 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.446810 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.448220 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"5acbe2626928a9484a1270837fafe8abbd4a44a12ff7f53c93f5608f90b5b665"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.448507 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://5acbe2626928a9484a1270837fafe8abbd4a44a12ff7f53c93f5608f90b5b665" gracePeriod=30 Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.505941 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.560097 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.561586 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.567320 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.602870 4691 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.639992 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.685017 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.785050 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.844906 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.849419 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.857944 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.883092 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.890083 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 02 07:50:19 crc kubenswrapper[4691]: I1202 07:50:19.958114 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.044109 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.046171 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.054384 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.108474 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.183121 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.253399 4691 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.391022 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.397124 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.438530 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.438530 4691 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.496994 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.523262 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.567208 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.591831 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.673130 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.754578 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.755608 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.796435 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.797125 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.861092 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.893906 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.925961 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.933942 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.934391 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.937511 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.946313 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 02 07:50:20 crc kubenswrapper[4691]: I1202 07:50:20.981888 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.001851 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 
07:50:21.073369 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.168323 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.235122 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.258604 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.278039 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.287888 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.293173 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.305370 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.353524 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.539919 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.544553 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.594337 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.608477 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.627880 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.647400 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.657404 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.686392 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 02 07:50:21 crc kubenswrapper[4691]: I1202 07:50:21.840118 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.117991 4691 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.186481 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.206195 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.206653 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.307928 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.372890 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.405022 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.541975 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.567563 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.598727 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.621834 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.640448 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.711634 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.712484 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.741898 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.783558 4691 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.792079 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.812250 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.834241 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.844160 4691 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.894479 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.937963 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.941166 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 02 07:50:22 crc kubenswrapper[4691]: I1202 07:50:22.951552 4691 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.043263 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.044983 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.220773 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.261572 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.491443 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.549120 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.572610 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.593243 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.601679 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.613290 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.635882 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.700164 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.805856 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.851078 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 
07:50:23.892309 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.897472 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 02 07:50:23 crc kubenswrapper[4691]: I1202 07:50:23.960648 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.117109 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.219303 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.226366 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.277503 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.286638 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.357700 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.396931 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.426052 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.588807 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.600003 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.622171 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.871692 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 02 07:50:24 crc kubenswrapper[4691]: I1202 07:50:24.948018 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.053416 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.187639 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.209670 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 02 07:50:25 crc kubenswrapper[4691]: 
I1202 07:50:25.410651 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.424602 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.426718 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.555069 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.636083 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.664109 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.738489 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.756052 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.779718 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.821297 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.895826 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.902350 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 07:50:25 crc kubenswrapper[4691]: I1202 07:50:25.923369 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.003158 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.060394 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.180845 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bjm9m"] Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.181567 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bjm9m" podUID="1ce6e92a-6368-4f1a-8926-88a28ff76460" containerName="registry-server" containerID="cri-o://61d00c4917ba25f4ef4fe71a4fee0a76c11ae1f8cccadb0690416a6179841e5e" gracePeriod=30 Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.195828 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54xgq"] Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.196326 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-54xgq" podUID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" containerName="registry-server" containerID="cri-o://b41f6ceaef9271f604b7f561fc117b6454f17dab08e167afab61723fbe8b5f04" gracePeriod=30
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.209780 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sv92r"]
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.209994 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" podUID="688a963d-2808-4961-a584-1ee4a3ada61d" containerName="marketplace-operator" containerID="cri-o://64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c" gracePeriod=30
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.213753 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.216734 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sc9f"]
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.217036 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5sc9f" podUID="81d25a09-ad32-4de2-860e-250010e610cb" containerName="registry-server" containerID="cri-o://be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4" gracePeriod=30
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.220650 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jmzz7"]
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.220883 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jmzz7" podUID="47426780-5ffb-47da-8a00-fb96b6a6099a" containerName="registry-server" containerID="cri-o://d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f" gracePeriod=30
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.313552 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.331767 4691 generic.go:334] "Generic (PLEG): container finished" podID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" containerID="b41f6ceaef9271f604b7f561fc117b6454f17dab08e167afab61723fbe8b5f04" exitCode=0
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.331820 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54xgq" event={"ID":"b6c2ff9e-15cc-4fab-8039-b51552b052c0","Type":"ContainerDied","Data":"b41f6ceaef9271f604b7f561fc117b6454f17dab08e167afab61723fbe8b5f04"}
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.334166 4691 generic.go:334] "Generic (PLEG): container finished" podID="1ce6e92a-6368-4f1a-8926-88a28ff76460" containerID="61d00c4917ba25f4ef4fe71a4fee0a76c11ae1f8cccadb0690416a6179841e5e" exitCode=0
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.334201 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjm9m" event={"ID":"1ce6e92a-6368-4f1a-8926-88a28ff76460","Type":"ContainerDied","Data":"61d00c4917ba25f4ef4fe71a4fee0a76c11ae1f8cccadb0690416a6179841e5e"}
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.422935 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.447489 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.546133 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.566269 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.636645 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.664563 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bjm9m"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.679266 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.701644 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.750607 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54xgq"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.752348 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sc9f"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.756713 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jmzz7"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.774521 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.795319 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4244w\" (UniqueName: \"kubernetes.io/projected/688a963d-2808-4961-a584-1ee4a3ada61d-kube-api-access-4244w\") pod \"688a963d-2808-4961-a584-1ee4a3ada61d\" (UID: \"688a963d-2808-4961-a584-1ee4a3ada61d\") "
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.795362 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdgjb\" (UniqueName: \"kubernetes.io/projected/1ce6e92a-6368-4f1a-8926-88a28ff76460-kube-api-access-vdgjb\") pod \"1ce6e92a-6368-4f1a-8926-88a28ff76460\" (UID: \"1ce6e92a-6368-4f1a-8926-88a28ff76460\") "
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.795425 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/688a963d-2808-4961-a584-1ee4a3ada61d-marketplace-operator-metrics\") pod \"688a963d-2808-4961-a584-1ee4a3ada61d\" (UID: \"688a963d-2808-4961-a584-1ee4a3ada61d\") "
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.795494 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce6e92a-6368-4f1a-8926-88a28ff76460-utilities\") pod \"1ce6e92a-6368-4f1a-8926-88a28ff76460\" (UID: \"1ce6e92a-6368-4f1a-8926-88a28ff76460\") "
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.795526 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce6e92a-6368-4f1a-8926-88a28ff76460-catalog-content\") pod \"1ce6e92a-6368-4f1a-8926-88a28ff76460\" (UID: \"1ce6e92a-6368-4f1a-8926-88a28ff76460\") "
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.795547 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/688a963d-2808-4961-a584-1ee4a3ada61d-marketplace-trusted-ca\") pod \"688a963d-2808-4961-a584-1ee4a3ada61d\" (UID: \"688a963d-2808-4961-a584-1ee4a3ada61d\") "
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.799904 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce6e92a-6368-4f1a-8926-88a28ff76460-utilities" (OuterVolumeSpecName: "utilities") pod "1ce6e92a-6368-4f1a-8926-88a28ff76460" (UID: "1ce6e92a-6368-4f1a-8926-88a28ff76460"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.803898 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/688a963d-2808-4961-a584-1ee4a3ada61d-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "688a963d-2808-4961-a584-1ee4a3ada61d" (UID: "688a963d-2808-4961-a584-1ee4a3ada61d"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.804747 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/688a963d-2808-4961-a584-1ee4a3ada61d-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "688a963d-2808-4961-a584-1ee4a3ada61d" (UID: "688a963d-2808-4961-a584-1ee4a3ada61d"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.805141 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/688a963d-2808-4961-a584-1ee4a3ada61d-kube-api-access-4244w" (OuterVolumeSpecName: "kube-api-access-4244w") pod "688a963d-2808-4961-a584-1ee4a3ada61d" (UID: "688a963d-2808-4961-a584-1ee4a3ada61d"). InnerVolumeSpecName "kube-api-access-4244w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.816279 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce6e92a-6368-4f1a-8926-88a28ff76460-kube-api-access-vdgjb" (OuterVolumeSpecName: "kube-api-access-vdgjb") pod "1ce6e92a-6368-4f1a-8926-88a28ff76460" (UID: "1ce6e92a-6368-4f1a-8926-88a28ff76460"). InnerVolumeSpecName "kube-api-access-vdgjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.855010 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ce6e92a-6368-4f1a-8926-88a28ff76460-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1ce6e92a-6368-4f1a-8926-88a28ff76460" (UID: "1ce6e92a-6368-4f1a-8926-88a28ff76460"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.896904 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d25a09-ad32-4de2-860e-250010e610cb-catalog-content\") pod \"81d25a09-ad32-4de2-860e-250010e610cb\" (UID: \"81d25a09-ad32-4de2-860e-250010e610cb\") " Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.896981 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs7wc\" (UniqueName: \"kubernetes.io/projected/47426780-5ffb-47da-8a00-fb96b6a6099a-kube-api-access-rs7wc\") pod \"47426780-5ffb-47da-8a00-fb96b6a6099a\" (UID: \"47426780-5ffb-47da-8a00-fb96b6a6099a\") " Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897030 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d25a09-ad32-4de2-860e-250010e610cb-utilities\") pod \"81d25a09-ad32-4de2-860e-250010e610cb\" (UID: \"81d25a09-ad32-4de2-860e-250010e610cb\") " Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897060 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47426780-5ffb-47da-8a00-fb96b6a6099a-utilities\") pod \"47426780-5ffb-47da-8a00-fb96b6a6099a\" (UID: \"47426780-5ffb-47da-8a00-fb96b6a6099a\") " Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897136 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6c2ff9e-15cc-4fab-8039-b51552b052c0-catalog-content\") pod \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\" (UID: \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\") " Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897206 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4s6xq\" (UniqueName: \"kubernetes.io/projected/81d25a09-ad32-4de2-860e-250010e610cb-kube-api-access-4s6xq\") pod \"81d25a09-ad32-4de2-860e-250010e610cb\" (UID: \"81d25a09-ad32-4de2-860e-250010e610cb\") " Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897287 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47426780-5ffb-47da-8a00-fb96b6a6099a-catalog-content\") pod \"47426780-5ffb-47da-8a00-fb96b6a6099a\" (UID: \"47426780-5ffb-47da-8a00-fb96b6a6099a\") " Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897318 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6c2ff9e-15cc-4fab-8039-b51552b052c0-utilities\") pod \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\" (UID: \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\") " Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897373 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54h5k\" (UniqueName: \"kubernetes.io/projected/b6c2ff9e-15cc-4fab-8039-b51552b052c0-kube-api-access-54h5k\") pod \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\" (UID: \"b6c2ff9e-15cc-4fab-8039-b51552b052c0\") " Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897667 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1ce6e92a-6368-4f1a-8926-88a28ff76460-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:26 crc 
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897688 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1ce6e92a-6368-4f1a-8926-88a28ff76460-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897702 4691 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/688a963d-2808-4961-a584-1ee4a3ada61d-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897714 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4244w\" (UniqueName: \"kubernetes.io/projected/688a963d-2808-4961-a584-1ee4a3ada61d-kube-api-access-4244w\") on node \"crc\" DevicePath \"\""
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897725 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdgjb\" (UniqueName: \"kubernetes.io/projected/1ce6e92a-6368-4f1a-8926-88a28ff76460-kube-api-access-vdgjb\") on node \"crc\" DevicePath \"\""
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897765 4691 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/688a963d-2808-4961-a584-1ee4a3ada61d-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897754 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81d25a09-ad32-4de2-860e-250010e610cb-utilities" (OuterVolumeSpecName: "utilities") pod "81d25a09-ad32-4de2-860e-250010e610cb" (UID: "81d25a09-ad32-4de2-860e-250010e610cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.897835 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47426780-5ffb-47da-8a00-fb96b6a6099a-utilities" (OuterVolumeSpecName: "utilities") pod "47426780-5ffb-47da-8a00-fb96b6a6099a" (UID: "47426780-5ffb-47da-8a00-fb96b6a6099a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.898627 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6c2ff9e-15cc-4fab-8039-b51552b052c0-utilities" (OuterVolumeSpecName: "utilities") pod "b6c2ff9e-15cc-4fab-8039-b51552b052c0" (UID: "b6c2ff9e-15cc-4fab-8039-b51552b052c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.900459 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81d25a09-ad32-4de2-860e-250010e610cb-kube-api-access-4s6xq" (OuterVolumeSpecName: "kube-api-access-4s6xq") pod "81d25a09-ad32-4de2-860e-250010e610cb" (UID: "81d25a09-ad32-4de2-860e-250010e610cb"). InnerVolumeSpecName "kube-api-access-4s6xq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.900509 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47426780-5ffb-47da-8a00-fb96b6a6099a-kube-api-access-rs7wc" (OuterVolumeSpecName: "kube-api-access-rs7wc") pod "47426780-5ffb-47da-8a00-fb96b6a6099a" (UID: "47426780-5ffb-47da-8a00-fb96b6a6099a"). InnerVolumeSpecName "kube-api-access-rs7wc". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.901183 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c2ff9e-15cc-4fab-8039-b51552b052c0-kube-api-access-54h5k" (OuterVolumeSpecName: "kube-api-access-54h5k") pod "b6c2ff9e-15cc-4fab-8039-b51552b052c0" (UID: "b6c2ff9e-15cc-4fab-8039-b51552b052c0"). InnerVolumeSpecName "kube-api-access-54h5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.916637 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81d25a09-ad32-4de2-860e-250010e610cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "81d25a09-ad32-4de2-860e-250010e610cb" (UID: "81d25a09-ad32-4de2-860e-250010e610cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.925453 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.949833 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6c2ff9e-15cc-4fab-8039-b51552b052c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6c2ff9e-15cc-4fab-8039-b51552b052c0" (UID: "b6c2ff9e-15cc-4fab-8039-b51552b052c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.999173 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6c2ff9e-15cc-4fab-8039-b51552b052c0-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.999207 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54h5k\" (UniqueName: \"kubernetes.io/projected/b6c2ff9e-15cc-4fab-8039-b51552b052c0-kube-api-access-54h5k\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.999224 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/81d25a09-ad32-4de2-860e-250010e610cb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.999237 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs7wc\" (UniqueName: \"kubernetes.io/projected/47426780-5ffb-47da-8a00-fb96b6a6099a-kube-api-access-rs7wc\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.999248 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/81d25a09-ad32-4de2-860e-250010e610cb-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.999290 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47426780-5ffb-47da-8a00-fb96b6a6099a-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.999300 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6c2ff9e-15cc-4fab-8039-b51552b052c0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.999312 4691 
Dec 02 07:50:26 crc kubenswrapper[4691]: I1202 07:50:26.999312 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4s6xq\" (UniqueName: \"kubernetes.io/projected/81d25a09-ad32-4de2-860e-250010e610cb-kube-api-access-4s6xq\") on node \"crc\" DevicePath \"\""
Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.016938 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47426780-5ffb-47da-8a00-fb96b6a6099a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47426780-5ffb-47da-8a00-fb96b6a6099a" (UID: "47426780-5ffb-47da-8a00-fb96b6a6099a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.070600 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.100395 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47426780-5ffb-47da-8a00-fb96b6a6099a-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.120384 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.229306 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.343085 4691 generic.go:334] "Generic (PLEG): container finished" podID="81d25a09-ad32-4de2-860e-250010e610cb" containerID="be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4" exitCode=0
Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.343134 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sc9f" event={"ID":"81d25a09-ad32-4de2-860e-250010e610cb","Type":"ContainerDied","Data":"be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4"}
Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.343196 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sc9f" event={"ID":"81d25a09-ad32-4de2-860e-250010e610cb","Type":"ContainerDied","Data":"656c308f86c07359a948683d0723b26f2afb31374b8e352dd6a8d8041a3f670e"}
Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.343251 4691 scope.go:117] "RemoveContainer" containerID="be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4"
Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.343380 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sc9f"
Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.346020 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-54xgq"
Need to start a new one" pod="openshift-marketplace/community-operators-54xgq" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.346513 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-54xgq" event={"ID":"b6c2ff9e-15cc-4fab-8039-b51552b052c0","Type":"ContainerDied","Data":"02e8895f04712c2534092602cd9f0342aa4177b4482010bb038dc2a57d77bb0b"} Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.349553 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bjm9m" event={"ID":"1ce6e92a-6368-4f1a-8926-88a28ff76460","Type":"ContainerDied","Data":"ffaaa3a5a715cf02e4244df4ea7296012780a895efe13c535c80324509f3a0d2"} Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.349739 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bjm9m" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.357522 4691 generic.go:334] "Generic (PLEG): container finished" podID="47426780-5ffb-47da-8a00-fb96b6a6099a" containerID="d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f" exitCode=0 Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.357665 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jmzz7" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.357690 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmzz7" event={"ID":"47426780-5ffb-47da-8a00-fb96b6a6099a","Type":"ContainerDied","Data":"d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f"} Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.357978 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jmzz7" event={"ID":"47426780-5ffb-47da-8a00-fb96b6a6099a","Type":"ContainerDied","Data":"04cb794ddda0de9b9bae776211c7e73b79ba3cbb0ce892ad72c315da1d649a16"} Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.361291 4691 generic.go:334] "Generic (PLEG): container finished" podID="688a963d-2808-4961-a584-1ee4a3ada61d" containerID="64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c" exitCode=0 Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.361330 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" event={"ID":"688a963d-2808-4961-a584-1ee4a3ada61d","Type":"ContainerDied","Data":"64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c"} Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.361352 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" event={"ID":"688a963d-2808-4961-a584-1ee4a3ada61d","Type":"ContainerDied","Data":"73f7ae1975982d278b7f710642b6a634f02d8ab9a80b112c5428a80f69b87e5b"} Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.361353 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-sv92r" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.375769 4691 scope.go:117] "RemoveContainer" containerID="561cc6033e5a2d94805428e20a58fbfdf58f14cffdb52442dc892cdf6df987ae" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.390866 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-54xgq"] Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.393907 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-54xgq"] Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.401288 4691 scope.go:117] "RemoveContainer" containerID="f3ae1cacbf19ec412bcb24480a55311b1aff1ccdb6f10cc563ad26acca924d48" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.404066 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bjm9m"] Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.410454 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bjm9m"] Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.420533 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sc9f"] Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.424442 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sc9f"] Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.424495 4691 scope.go:117] "RemoveContainer" containerID="be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4" Dec 02 07:50:27 crc kubenswrapper[4691]: E1202 07:50:27.425052 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4\": container with ID starting with be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4 not found: ID does not exist" containerID="be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.425083 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4"} err="failed to get container status \"be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4\": rpc error: code = NotFound desc = could not find container \"be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4\": container with ID starting with be155d0c4822f1a753d34294e26e60b3877472a831d2060eac2c3e5d7f7c5ad4 not found: ID does not exist" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.425108 4691 scope.go:117] "RemoveContainer" containerID="561cc6033e5a2d94805428e20a58fbfdf58f14cffdb52442dc892cdf6df987ae" Dec 02 07:50:27 crc kubenswrapper[4691]: E1202 07:50:27.425585 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"561cc6033e5a2d94805428e20a58fbfdf58f14cffdb52442dc892cdf6df987ae\": container with ID starting with 561cc6033e5a2d94805428e20a58fbfdf58f14cffdb52442dc892cdf6df987ae not found: ID does not exist" containerID="561cc6033e5a2d94805428e20a58fbfdf58f14cffdb52442dc892cdf6df987ae" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.425632 4691 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"561cc6033e5a2d94805428e20a58fbfdf58f14cffdb52442dc892cdf6df987ae"} err="failed to get container status \"561cc6033e5a2d94805428e20a58fbfdf58f14cffdb52442dc892cdf6df987ae\": rpc error: code = NotFound desc = could not find container \"561cc6033e5a2d94805428e20a58fbfdf58f14cffdb52442dc892cdf6df987ae\": container with ID starting with 561cc6033e5a2d94805428e20a58fbfdf58f14cffdb52442dc892cdf6df987ae not found: ID does not exist" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.425662 4691 scope.go:117] "RemoveContainer" containerID="f3ae1cacbf19ec412bcb24480a55311b1aff1ccdb6f10cc563ad26acca924d48" Dec 02 07:50:27 crc kubenswrapper[4691]: E1202 07:50:27.425930 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ae1cacbf19ec412bcb24480a55311b1aff1ccdb6f10cc563ad26acca924d48\": container with ID starting with f3ae1cacbf19ec412bcb24480a55311b1aff1ccdb6f10cc563ad26acca924d48 not found: ID does not exist" containerID="f3ae1cacbf19ec412bcb24480a55311b1aff1ccdb6f10cc563ad26acca924d48" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.425963 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ae1cacbf19ec412bcb24480a55311b1aff1ccdb6f10cc563ad26acca924d48"} err="failed to get container status \"f3ae1cacbf19ec412bcb24480a55311b1aff1ccdb6f10cc563ad26acca924d48\": rpc error: code = NotFound desc = could not find container \"f3ae1cacbf19ec412bcb24480a55311b1aff1ccdb6f10cc563ad26acca924d48\": container with ID starting with f3ae1cacbf19ec412bcb24480a55311b1aff1ccdb6f10cc563ad26acca924d48 not found: ID does not exist" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.425985 4691 scope.go:117] "RemoveContainer" containerID="b41f6ceaef9271f604b7f561fc117b6454f17dab08e167afab61723fbe8b5f04" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.427489 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jmzz7"] Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.438226 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jmzz7"] Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.443835 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sv92r"] Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.445599 4691 scope.go:117] "RemoveContainer" containerID="eff7158962c9da587891c3fee426c28e8cf1f3fda5ef6440e20df303a804c0be" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.448113 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-sv92r"] Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.460706 4691 scope.go:117] "RemoveContainer" containerID="f56bae39991695327afadbc7b0f3fd3ae378d0d75eb256f89ebb6da23e80a2f0" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.475422 4691 scope.go:117] "RemoveContainer" containerID="61d00c4917ba25f4ef4fe71a4fee0a76c11ae1f8cccadb0690416a6179841e5e" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.489907 4691 scope.go:117] "RemoveContainer" containerID="446a5af40f2f6758da98de4b45d4dd6834a2cc39783fede09a08f8ebfce4e334" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.502899 4691 scope.go:117] "RemoveContainer" containerID="f0b5e3a5e90e9ee1b3de41394161451e0bf577e8a4e6067c6c73c2f5b2d619c9" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.518641 4691 
scope.go:117] "RemoveContainer" containerID="d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.533603 4691 scope.go:117] "RemoveContainer" containerID="63dcdbdccdb852cfc1c7fbd73e99467f1061ed49f91ad3e795004125458da9c6" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.542672 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.549508 4691 scope.go:117] "RemoveContainer" containerID="6734ca0fedf7608ddcbc6a6e75cddd2d90992408ee638e0ff3397b0f8cc3e8d2" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.561812 4691 scope.go:117] "RemoveContainer" containerID="d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f" Dec 02 07:50:27 crc kubenswrapper[4691]: E1202 07:50:27.562276 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f\": container with ID starting with d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f not found: ID does not exist" containerID="d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.562319 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f"} err="failed to get container status \"d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f\": rpc error: code = NotFound desc = could not find container \"d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f\": container with ID starting with d53dae7a4550f06f6caa8f4acc5ec5a735f9fc1b53864fc53f0690ebe05bd39f not found: ID does not exist" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.562348 4691 scope.go:117] "RemoveContainer" containerID="63dcdbdccdb852cfc1c7fbd73e99467f1061ed49f91ad3e795004125458da9c6" Dec 02 07:50:27 crc kubenswrapper[4691]: E1202 07:50:27.562708 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63dcdbdccdb852cfc1c7fbd73e99467f1061ed49f91ad3e795004125458da9c6\": container with ID starting with 63dcdbdccdb852cfc1c7fbd73e99467f1061ed49f91ad3e795004125458da9c6 not found: ID does not exist" containerID="63dcdbdccdb852cfc1c7fbd73e99467f1061ed49f91ad3e795004125458da9c6" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.562788 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63dcdbdccdb852cfc1c7fbd73e99467f1061ed49f91ad3e795004125458da9c6"} err="failed to get container status \"63dcdbdccdb852cfc1c7fbd73e99467f1061ed49f91ad3e795004125458da9c6\": rpc error: code = NotFound desc = could not find container \"63dcdbdccdb852cfc1c7fbd73e99467f1061ed49f91ad3e795004125458da9c6\": container with ID starting with 63dcdbdccdb852cfc1c7fbd73e99467f1061ed49f91ad3e795004125458da9c6 not found: ID does not exist" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.562824 4691 scope.go:117] "RemoveContainer" containerID="6734ca0fedf7608ddcbc6a6e75cddd2d90992408ee638e0ff3397b0f8cc3e8d2" Dec 02 07:50:27 crc kubenswrapper[4691]: E1202 07:50:27.563239 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"6734ca0fedf7608ddcbc6a6e75cddd2d90992408ee638e0ff3397b0f8cc3e8d2\": container with ID starting with 6734ca0fedf7608ddcbc6a6e75cddd2d90992408ee638e0ff3397b0f8cc3e8d2 not found: ID does not exist" containerID="6734ca0fedf7608ddcbc6a6e75cddd2d90992408ee638e0ff3397b0f8cc3e8d2" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.563269 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6734ca0fedf7608ddcbc6a6e75cddd2d90992408ee638e0ff3397b0f8cc3e8d2"} err="failed to get container status \"6734ca0fedf7608ddcbc6a6e75cddd2d90992408ee638e0ff3397b0f8cc3e8d2\": rpc error: code = NotFound desc = could not find container \"6734ca0fedf7608ddcbc6a6e75cddd2d90992408ee638e0ff3397b0f8cc3e8d2\": container with ID starting with 6734ca0fedf7608ddcbc6a6e75cddd2d90992408ee638e0ff3397b0f8cc3e8d2 not found: ID does not exist" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.563293 4691 scope.go:117] "RemoveContainer" containerID="64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.577953 4691 scope.go:117] "RemoveContainer" containerID="64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c" Dec 02 07:50:27 crc kubenswrapper[4691]: E1202 07:50:27.578344 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c\": container with ID starting with 64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c not found: ID does not exist" containerID="64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.578373 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c"} err="failed to get container status \"64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c\": rpc error: code = NotFound desc = could not find container \"64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c\": container with ID starting with 64dbfe63764c8f651d7f96e7ec5026f911005336208044f2bd913ac8d1210e5c not found: ID does not exist" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.644850 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.744800 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 02 07:50:27 crc kubenswrapper[4691]: I1202 07:50:27.987201 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 02 07:50:28 crc kubenswrapper[4691]: I1202 07:50:28.195517 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 02 07:50:28 crc kubenswrapper[4691]: I1202 07:50:28.343223 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 02 07:50:28 crc kubenswrapper[4691]: I1202 07:50:28.393492 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 02 07:50:28 crc kubenswrapper[4691]: I1202 07:50:28.433680 4691 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 02 07:50:28 crc kubenswrapper[4691]: I1202 07:50:28.568475 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ce6e92a-6368-4f1a-8926-88a28ff76460" path="/var/lib/kubelet/pods/1ce6e92a-6368-4f1a-8926-88a28ff76460/volumes" Dec 02 07:50:28 crc kubenswrapper[4691]: I1202 07:50:28.569368 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47426780-5ffb-47da-8a00-fb96b6a6099a" path="/var/lib/kubelet/pods/47426780-5ffb-47da-8a00-fb96b6a6099a/volumes" Dec 02 07:50:28 crc kubenswrapper[4691]: I1202 07:50:28.570030 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="688a963d-2808-4961-a584-1ee4a3ada61d" path="/var/lib/kubelet/pods/688a963d-2808-4961-a584-1ee4a3ada61d/volumes" Dec 02 07:50:28 crc kubenswrapper[4691]: I1202 07:50:28.570453 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81d25a09-ad32-4de2-860e-250010e610cb" path="/var/lib/kubelet/pods/81d25a09-ad32-4de2-860e-250010e610cb/volumes" Dec 02 07:50:28 crc kubenswrapper[4691]: I1202 07:50:28.571050 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" path="/var/lib/kubelet/pods/b6c2ff9e-15cc-4fab-8039-b51552b052c0/volumes" Dec 02 07:50:29 crc kubenswrapper[4691]: I1202 07:50:29.380873 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 02 07:50:29 crc kubenswrapper[4691]: I1202 07:50:29.799323 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 02 07:50:49 crc kubenswrapper[4691]: I1202 07:50:49.554744 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 02 07:50:49 crc kubenswrapper[4691]: I1202 07:50:49.557116 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 02 07:50:49 crc kubenswrapper[4691]: I1202 07:50:49.557178 4691 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5acbe2626928a9484a1270837fafe8abbd4a44a12ff7f53c93f5608f90b5b665" exitCode=137 Dec 02 07:50:49 crc kubenswrapper[4691]: I1202 07:50:49.557218 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5acbe2626928a9484a1270837fafe8abbd4a44a12ff7f53c93f5608f90b5b665"} Dec 02 07:50:49 crc kubenswrapper[4691]: I1202 07:50:49.557262 4691 scope.go:117] "RemoveContainer" containerID="fd1385baa8ed044a5e912aae84016c76d6914cd37c1ea9b09bf37c9db1e2a8ab" Dec 02 07:50:50 crc kubenswrapper[4691]: I1202 07:50:50.565619 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Dec 02 07:50:50 crc kubenswrapper[4691]: I1202 07:50:50.572447 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d1a74045dc42ad05f41f9753c53d5e6773e40aa3a26c577a317bc39db7cbca4b"} 
Dec 02 07:50:58 crc kubenswrapper[4691]: I1202 07:50:58.353641 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 07:50:59 crc kubenswrapper[4691]: I1202 07:50:59.445793 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 07:50:59 crc kubenswrapper[4691]: I1202 07:50:59.451329 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.981185 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kb4km"]
Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.983303 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce6e92a-6368-4f1a-8926-88a28ff76460" containerName="extract-content"
Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.983407 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce6e92a-6368-4f1a-8926-88a28ff76460" containerName="extract-content"
Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.983482 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.983556 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.983645 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d25a09-ad32-4de2-860e-250010e610cb" containerName="registry-server"
Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.983728 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d25a09-ad32-4de2-860e-250010e610cb" containerName="registry-server"
Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.983841 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce6e92a-6368-4f1a-8926-88a28ff76460" containerName="registry-server"
Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.983929 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce6e92a-6368-4f1a-8926-88a28ff76460" containerName="registry-server"
Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.984009 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d25a09-ad32-4de2-860e-250010e610cb" containerName="extract-utilities"
Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.984092 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d25a09-ad32-4de2-860e-250010e610cb" containerName="extract-utilities"
Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.984199 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47426780-5ffb-47da-8a00-fb96b6a6099a" containerName="registry-server"
Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.984283 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="47426780-5ffb-47da-8a00-fb96b6a6099a" containerName="registry-server"
Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.984367 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce6e92a-6368-4f1a-8926-88a28ff76460" containerName="extract-utilities"
Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.984449 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce6e92a-6368-4f1a-8926-88a28ff76460" containerName="extract-utilities"
containerName="extract-utilities" Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.984529 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" containerName="installer" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.984687 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" containerName="installer" Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.984800 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47426780-5ffb-47da-8a00-fb96b6a6099a" containerName="extract-content" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.984897 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="47426780-5ffb-47da-8a00-fb96b6a6099a" containerName="extract-content" Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.984980 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" containerName="extract-content" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.985061 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" containerName="extract-content" Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.985181 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d25a09-ad32-4de2-860e-250010e610cb" containerName="extract-content" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.985272 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d25a09-ad32-4de2-860e-250010e610cb" containerName="extract-content" Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.985354 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47426780-5ffb-47da-8a00-fb96b6a6099a" containerName="extract-utilities" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.985534 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="47426780-5ffb-47da-8a00-fb96b6a6099a" containerName="extract-utilities" Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.985622 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" containerName="registry-server" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.985707 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" containerName="registry-server" Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.985812 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" containerName="extract-utilities" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.985905 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" containerName="extract-utilities" Dec 02 07:51:07 crc kubenswrapper[4691]: E1202 07:51:07.985996 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="688a963d-2808-4961-a584-1ee4a3ada61d" containerName="marketplace-operator" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.986084 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="688a963d-2808-4961-a584-1ee4a3ada61d" containerName="marketplace-operator" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.986321 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce6e92a-6368-4f1a-8926-88a28ff76460" containerName="registry-server" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.986505 4691 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b6c2ff9e-15cc-4fab-8039-b51552b052c0" containerName="registry-server" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.986607 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c0f9d79-0ecc-4cba-86e4-8587b32f45b4" containerName="installer" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.986699 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.986794 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="47426780-5ffb-47da-8a00-fb96b6a6099a" containerName="registry-server" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.986883 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="688a963d-2808-4961-a584-1ee4a3ada61d" containerName="marketplace-operator" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.986961 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="81d25a09-ad32-4de2-860e-250010e610cb" containerName="registry-server" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.987574 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.990509 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 02 07:51:07 crc kubenswrapper[4691]: I1202 07:51:07.995005 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kb4km"] Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.008082 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.008965 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.009212 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.009041 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.011834 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-87dnb"] Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.012074 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" podUID="eb7b09b1-f092-4e2b-8b07-e03343753503" containerName="controller-manager" containerID="cri-o://8dd7dbb6b386629f42362837e1ccd1f13f2f2de51b389168dc847c9d4ba16374" gracePeriod=30 Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.014995 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb"] Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.015176 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" podUID="ea5aa03a-f69e-4e94-8586-de42593bce47" containerName="route-controller-manager" 
containerID="cri-o://c0f5ab488dad468beb72deb8f9314317982cf2cc0e9367198c1c1b95bafc427a" gracePeriod=30 Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.103023 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qvcq\" (UniqueName: \"kubernetes.io/projected/411ccc93-fc18-44f5-b96f-f2da874ae9be-kube-api-access-6qvcq\") pod \"marketplace-operator-79b997595-kb4km\" (UID: \"411ccc93-fc18-44f5-b96f-f2da874ae9be\") " pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.103068 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/411ccc93-fc18-44f5-b96f-f2da874ae9be-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kb4km\" (UID: \"411ccc93-fc18-44f5-b96f-f2da874ae9be\") " pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.103103 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/411ccc93-fc18-44f5-b96f-f2da874ae9be-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kb4km\" (UID: \"411ccc93-fc18-44f5-b96f-f2da874ae9be\") " pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.203999 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/411ccc93-fc18-44f5-b96f-f2da874ae9be-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kb4km\" (UID: \"411ccc93-fc18-44f5-b96f-f2da874ae9be\") " pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.204110 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qvcq\" (UniqueName: \"kubernetes.io/projected/411ccc93-fc18-44f5-b96f-f2da874ae9be-kube-api-access-6qvcq\") pod \"marketplace-operator-79b997595-kb4km\" (UID: \"411ccc93-fc18-44f5-b96f-f2da874ae9be\") " pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.204136 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/411ccc93-fc18-44f5-b96f-f2da874ae9be-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kb4km\" (UID: \"411ccc93-fc18-44f5-b96f-f2da874ae9be\") " pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.205398 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/411ccc93-fc18-44f5-b96f-f2da874ae9be-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kb4km\" (UID: \"411ccc93-fc18-44f5-b96f-f2da874ae9be\") " pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.213312 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/411ccc93-fc18-44f5-b96f-f2da874ae9be-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kb4km\" (UID: 
\"411ccc93-fc18-44f5-b96f-f2da874ae9be\") " pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.229537 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qvcq\" (UniqueName: \"kubernetes.io/projected/411ccc93-fc18-44f5-b96f-f2da874ae9be-kube-api-access-6qvcq\") pod \"marketplace-operator-79b997595-kb4km\" (UID: \"411ccc93-fc18-44f5-b96f-f2da874ae9be\") " pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.313131 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.358591 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.691963 4691 generic.go:334] "Generic (PLEG): container finished" podID="ea5aa03a-f69e-4e94-8586-de42593bce47" containerID="c0f5ab488dad468beb72deb8f9314317982cf2cc0e9367198c1c1b95bafc427a" exitCode=0 Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.692012 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" event={"ID":"ea5aa03a-f69e-4e94-8586-de42593bce47","Type":"ContainerDied","Data":"c0f5ab488dad468beb72deb8f9314317982cf2cc0e9367198c1c1b95bafc427a"} Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.695365 4691 generic.go:334] "Generic (PLEG): container finished" podID="eb7b09b1-f092-4e2b-8b07-e03343753503" containerID="8dd7dbb6b386629f42362837e1ccd1f13f2f2de51b389168dc847c9d4ba16374" exitCode=0 Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.695402 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" event={"ID":"eb7b09b1-f092-4e2b-8b07-e03343753503","Type":"ContainerDied","Data":"8dd7dbb6b386629f42362837e1ccd1f13f2f2de51b389168dc847c9d4ba16374"} Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.782467 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kb4km"] Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.904355 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:51:08 crc kubenswrapper[4691]: I1202 07:51:08.930530 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.013547 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea5aa03a-f69e-4e94-8586-de42593bce47-serving-cert\") pod \"ea5aa03a-f69e-4e94-8586-de42593bce47\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.013872 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdhlb\" (UniqueName: \"kubernetes.io/projected/ea5aa03a-f69e-4e94-8586-de42593bce47-kube-api-access-bdhlb\") pod \"ea5aa03a-f69e-4e94-8586-de42593bce47\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.013914 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7b09b1-f092-4e2b-8b07-e03343753503-serving-cert\") pod \"eb7b09b1-f092-4e2b-8b07-e03343753503\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.013959 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wkm8d\" (UniqueName: \"kubernetes.io/projected/eb7b09b1-f092-4e2b-8b07-e03343753503-kube-api-access-wkm8d\") pod \"eb7b09b1-f092-4e2b-8b07-e03343753503\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.013986 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea5aa03a-f69e-4e94-8586-de42593bce47-config\") pod \"ea5aa03a-f69e-4e94-8586-de42593bce47\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.014005 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-client-ca\") pod \"eb7b09b1-f092-4e2b-8b07-e03343753503\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.014034 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-proxy-ca-bundles\") pod \"eb7b09b1-f092-4e2b-8b07-e03343753503\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.014081 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-config\") pod \"eb7b09b1-f092-4e2b-8b07-e03343753503\" (UID: \"eb7b09b1-f092-4e2b-8b07-e03343753503\") " Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.014117 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea5aa03a-f69e-4e94-8586-de42593bce47-client-ca\") pod \"ea5aa03a-f69e-4e94-8586-de42593bce47\" (UID: \"ea5aa03a-f69e-4e94-8586-de42593bce47\") " Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.014720 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea5aa03a-f69e-4e94-8586-de42593bce47-config" (OuterVolumeSpecName: "config") pod "ea5aa03a-f69e-4e94-8586-de42593bce47" (UID: 
"ea5aa03a-f69e-4e94-8586-de42593bce47"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.014719 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-client-ca" (OuterVolumeSpecName: "client-ca") pod "eb7b09b1-f092-4e2b-8b07-e03343753503" (UID: "eb7b09b1-f092-4e2b-8b07-e03343753503"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.014834 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea5aa03a-f69e-4e94-8586-de42593bce47-client-ca" (OuterVolumeSpecName: "client-ca") pod "ea5aa03a-f69e-4e94-8586-de42593bce47" (UID: "ea5aa03a-f69e-4e94-8586-de42593bce47"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.014897 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "eb7b09b1-f092-4e2b-8b07-e03343753503" (UID: "eb7b09b1-f092-4e2b-8b07-e03343753503"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.014985 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-config" (OuterVolumeSpecName: "config") pod "eb7b09b1-f092-4e2b-8b07-e03343753503" (UID: "eb7b09b1-f092-4e2b-8b07-e03343753503"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.019258 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea5aa03a-f69e-4e94-8586-de42593bce47-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ea5aa03a-f69e-4e94-8586-de42593bce47" (UID: "ea5aa03a-f69e-4e94-8586-de42593bce47"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.019355 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea5aa03a-f69e-4e94-8586-de42593bce47-kube-api-access-bdhlb" (OuterVolumeSpecName: "kube-api-access-bdhlb") pod "ea5aa03a-f69e-4e94-8586-de42593bce47" (UID: "ea5aa03a-f69e-4e94-8586-de42593bce47"). InnerVolumeSpecName "kube-api-access-bdhlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.019711 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb7b09b1-f092-4e2b-8b07-e03343753503-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eb7b09b1-f092-4e2b-8b07-e03343753503" (UID: "eb7b09b1-f092-4e2b-8b07-e03343753503"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.020205 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb7b09b1-f092-4e2b-8b07-e03343753503-kube-api-access-wkm8d" (OuterVolumeSpecName: "kube-api-access-wkm8d") pod "eb7b09b1-f092-4e2b-8b07-e03343753503" (UID: "eb7b09b1-f092-4e2b-8b07-e03343753503"). 
InnerVolumeSpecName "kube-api-access-wkm8d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.115117 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wkm8d\" (UniqueName: \"kubernetes.io/projected/eb7b09b1-f092-4e2b-8b07-e03343753503-kube-api-access-wkm8d\") on node \"crc\" DevicePath \"\"" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.115150 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea5aa03a-f69e-4e94-8586-de42593bce47-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.115162 4691 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.115171 4691 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.115180 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eb7b09b1-f092-4e2b-8b07-e03343753503-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.115189 4691 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ea5aa03a-f69e-4e94-8586-de42593bce47-client-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.115196 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea5aa03a-f69e-4e94-8586-de42593bce47-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.115204 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdhlb\" (UniqueName: \"kubernetes.io/projected/ea5aa03a-f69e-4e94-8586-de42593bce47-kube-api-access-bdhlb\") on node \"crc\" DevicePath \"\"" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.115212 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eb7b09b1-f092-4e2b-8b07-e03343753503-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.436275 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw"] Dec 02 07:51:09 crc kubenswrapper[4691]: E1202 07:51:09.436807 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea5aa03a-f69e-4e94-8586-de42593bce47" containerName="route-controller-manager" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.436933 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea5aa03a-f69e-4e94-8586-de42593bce47" containerName="route-controller-manager" Dec 02 07:51:09 crc kubenswrapper[4691]: E1202 07:51:09.437027 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb7b09b1-f092-4e2b-8b07-e03343753503" containerName="controller-manager" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.437114 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb7b09b1-f092-4e2b-8b07-e03343753503" containerName="controller-manager" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 
07:51:09.437320 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea5aa03a-f69e-4e94-8586-de42593bce47" containerName="route-controller-manager" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.437440 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb7b09b1-f092-4e2b-8b07-e03343753503" containerName="controller-manager" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.437960 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.439896 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-645d6d6c79-rgjxb"] Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.440549 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.456267 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-645d6d6c79-rgjxb"] Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.459869 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw"] Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.520179 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b5bbd65-e255-4795-91b8-2bbbaf130f24-serving-cert\") pod \"route-controller-manager-75f94579cc-gw5rw\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") " pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.520480 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-client-ca\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.520627 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5gtp\" (UniqueName: \"kubernetes.io/projected/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-kube-api-access-r5gtp\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.520745 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-proxy-ca-bundles\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.520892 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-serving-cert\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " 
pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.520999 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b5bbd65-e255-4795-91b8-2bbbaf130f24-config\") pod \"route-controller-manager-75f94579cc-gw5rw\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") " pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.521145 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zch5d\" (UniqueName: \"kubernetes.io/projected/9b5bbd65-e255-4795-91b8-2bbbaf130f24-kube-api-access-zch5d\") pod \"route-controller-manager-75f94579cc-gw5rw\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") " pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.521271 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-config\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.521543 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b5bbd65-e255-4795-91b8-2bbbaf130f24-client-ca\") pod \"route-controller-manager-75f94579cc-gw5rw\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") " pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.622770 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-serving-cert\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.622837 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b5bbd65-e255-4795-91b8-2bbbaf130f24-config\") pod \"route-controller-manager-75f94579cc-gw5rw\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") " pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.622870 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zch5d\" (UniqueName: \"kubernetes.io/projected/9b5bbd65-e255-4795-91b8-2bbbaf130f24-kube-api-access-zch5d\") pod \"route-controller-manager-75f94579cc-gw5rw\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") " pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.622894 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-config\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " 
pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.622919 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b5bbd65-e255-4795-91b8-2bbbaf130f24-client-ca\") pod \"route-controller-manager-75f94579cc-gw5rw\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") " pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.622948 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b5bbd65-e255-4795-91b8-2bbbaf130f24-serving-cert\") pod \"route-controller-manager-75f94579cc-gw5rw\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") " pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.622964 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-client-ca\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.622985 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5gtp\" (UniqueName: \"kubernetes.io/projected/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-kube-api-access-r5gtp\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.623003 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-proxy-ca-bundles\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.624096 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b5bbd65-e255-4795-91b8-2bbbaf130f24-client-ca\") pod \"route-controller-manager-75f94579cc-gw5rw\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") " pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.624101 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-proxy-ca-bundles\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.624219 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-client-ca\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.625243 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b5bbd65-e255-4795-91b8-2bbbaf130f24-config\") pod \"route-controller-manager-75f94579cc-gw5rw\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") " pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.625434 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-config\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.627772 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b5bbd65-e255-4795-91b8-2bbbaf130f24-serving-cert\") pod \"route-controller-manager-75f94579cc-gw5rw\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") " pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.632906 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-serving-cert\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.643435 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zch5d\" (UniqueName: \"kubernetes.io/projected/9b5bbd65-e255-4795-91b8-2bbbaf130f24-kube-api-access-zch5d\") pod \"route-controller-manager-75f94579cc-gw5rw\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") " pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.646566 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5gtp\" (UniqueName: \"kubernetes.io/projected/bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7-kube-api-access-r5gtp\") pod \"controller-manager-645d6d6c79-rgjxb\" (UID: \"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7\") " pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.700701 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" event={"ID":"ea5aa03a-f69e-4e94-8586-de42593bce47","Type":"ContainerDied","Data":"18196a9d7ff9a3781e07d6f0821292bc591f19190871b778ca23e156d4455ad1"} Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.700754 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.700818 4691 scope.go:117] "RemoveContainer" containerID="c0f5ab488dad468beb72deb8f9314317982cf2cc0e9367198c1c1b95bafc427a" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.707668 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" event={"ID":"411ccc93-fc18-44f5-b96f-f2da874ae9be","Type":"ContainerStarted","Data":"5a5aa00f4ae9d8a7868ff67e2b5bcdd692f4036a6b98b5cc5a09676729b1ba9a"} Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.707722 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.707733 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" event={"ID":"411ccc93-fc18-44f5-b96f-f2da874ae9be","Type":"ContainerStarted","Data":"f8330945ef5643d523a36da7bcb8c9d2ad75b68cdb853e1eedfdba6b736d3267"} Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.710454 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" event={"ID":"eb7b09b1-f092-4e2b-8b07-e03343753503","Type":"ContainerDied","Data":"5ff4b82205f9feca9c0e16717be9da438f0f134d443e7583fa199adf9ad23ea9"} Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.710486 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-87dnb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.714673 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.726717 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kb4km" podStartSLOduration=2.726695247 podStartE2EDuration="2.726695247s" podCreationTimestamp="2025-12-02 07:51:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:51:09.722427631 +0000 UTC m=+317.506506513" watchObservedRunningTime="2025-12-02 07:51:09.726695247 +0000 UTC m=+317.510774109" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.728431 4691 scope.go:117] "RemoveContainer" containerID="8dd7dbb6b386629f42362837e1ccd1f13f2f2de51b389168dc847c9d4ba16374" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.753040 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.763064 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-87dnb"] Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.765868 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-87dnb"] Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.766114 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.777473 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb"] Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.789570 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bdlzb"] Dec 02 07:51:09 crc kubenswrapper[4691]: I1202 07:51:09.986529 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-645d6d6c79-rgjxb"] Dec 02 07:51:09 crc kubenswrapper[4691]: W1202 07:51:09.998048 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd58184b_4da9_4d3f_9aa7_f19d28a1a3e7.slice/crio-6110a0ffb55ac552402d34a186ee935b0cc6fec03a54e1619932632d0ad6f6e4 WatchSource:0}: Error finding container 6110a0ffb55ac552402d34a186ee935b0cc6fec03a54e1619932632d0ad6f6e4: Status 404 returned error can't find the container with id 6110a0ffb55ac552402d34a186ee935b0cc6fec03a54e1619932632d0ad6f6e4 Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.051372 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw"] Dec 02 07:51:10 crc kubenswrapper[4691]: W1202 07:51:10.054854 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b5bbd65_e255_4795_91b8_2bbbaf130f24.slice/crio-fb207815d8a290ab16b76823924eca1f6acc8280bff28ee0cdcf9e5893d0878f WatchSource:0}: Error finding container fb207815d8a290ab16b76823924eca1f6acc8280bff28ee0cdcf9e5893d0878f: Status 404 returned error can't find the container with id fb207815d8a290ab16b76823924eca1f6acc8280bff28ee0cdcf9e5893d0878f Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.571395 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea5aa03a-f69e-4e94-8586-de42593bce47" path="/var/lib/kubelet/pods/ea5aa03a-f69e-4e94-8586-de42593bce47/volumes" Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.572663 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb7b09b1-f092-4e2b-8b07-e03343753503" path="/var/lib/kubelet/pods/eb7b09b1-f092-4e2b-8b07-e03343753503/volumes" Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.716767 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" event={"ID":"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7","Type":"ContainerStarted","Data":"7436d9cc6cdf37f363f45f971cdd60d0f1d72778a0d54767ea2b98699f3cb11c"} Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.716826 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.716837 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" event={"ID":"bd58184b-4da9-4d3f-9aa7-f19d28a1a3e7","Type":"ContainerStarted","Data":"6110a0ffb55ac552402d34a186ee935b0cc6fec03a54e1619932632d0ad6f6e4"} Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.719835 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" event={"ID":"9b5bbd65-e255-4795-91b8-2bbbaf130f24","Type":"ContainerStarted","Data":"5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30"} Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.719961 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" event={"ID":"9b5bbd65-e255-4795-91b8-2bbbaf130f24","Type":"ContainerStarted","Data":"fb207815d8a290ab16b76823924eca1f6acc8280bff28ee0cdcf9e5893d0878f"} Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.720697 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.721875 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.727483 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.766829 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" podStartSLOduration=2.7668134110000002 podStartE2EDuration="2.766813411s" podCreationTimestamp="2025-12-02 07:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:51:10.765128889 +0000 UTC m=+318.549207751" watchObservedRunningTime="2025-12-02 07:51:10.766813411 +0000 UTC m=+318.550892273" Dec 02 07:51:10 crc kubenswrapper[4691]: I1202 07:51:10.767666 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-645d6d6c79-rgjxb" podStartSLOduration=2.767660062 podStartE2EDuration="2.767660062s" podCreationTimestamp="2025-12-02 07:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:51:10.749996544 +0000 UTC m=+318.534075406" watchObservedRunningTime="2025-12-02 07:51:10.767660062 +0000 UTC m=+318.551738914" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.642791 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-f7x42"] Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.644101 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f7x42" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.646710 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.657910 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f7x42"] Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.750792 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26a045c-658e-4950-9d31-98fcc7405794-utilities\") pod \"redhat-marketplace-f7x42\" (UID: \"d26a045c-658e-4950-9d31-98fcc7405794\") " pod="openshift-marketplace/redhat-marketplace-f7x42" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.750840 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26a045c-658e-4950-9d31-98fcc7405794-catalog-content\") pod \"redhat-marketplace-f7x42\" (UID: \"d26a045c-658e-4950-9d31-98fcc7405794\") " pod="openshift-marketplace/redhat-marketplace-f7x42" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.750871 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb9qj\" (UniqueName: \"kubernetes.io/projected/d26a045c-658e-4950-9d31-98fcc7405794-kube-api-access-rb9qj\") pod \"redhat-marketplace-f7x42\" (UID: \"d26a045c-658e-4950-9d31-98fcc7405794\") " pod="openshift-marketplace/redhat-marketplace-f7x42" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.842836 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8hdfj"] Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.843808 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8hdfj" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.845966 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.852855 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26a045c-658e-4950-9d31-98fcc7405794-utilities\") pod \"redhat-marketplace-f7x42\" (UID: \"d26a045c-658e-4950-9d31-98fcc7405794\") " pod="openshift-marketplace/redhat-marketplace-f7x42" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.853017 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d26a045c-658e-4950-9d31-98fcc7405794-utilities\") pod \"redhat-marketplace-f7x42\" (UID: \"d26a045c-658e-4950-9d31-98fcc7405794\") " pod="openshift-marketplace/redhat-marketplace-f7x42" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.853643 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26a045c-658e-4950-9d31-98fcc7405794-catalog-content\") pod \"redhat-marketplace-f7x42\" (UID: \"d26a045c-658e-4950-9d31-98fcc7405794\") " pod="openshift-marketplace/redhat-marketplace-f7x42" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.854178 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d26a045c-658e-4950-9d31-98fcc7405794-catalog-content\") pod \"redhat-marketplace-f7x42\" (UID: \"d26a045c-658e-4950-9d31-98fcc7405794\") " pod="openshift-marketplace/redhat-marketplace-f7x42" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.854731 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb9qj\" (UniqueName: \"kubernetes.io/projected/d26a045c-658e-4950-9d31-98fcc7405794-kube-api-access-rb9qj\") pod \"redhat-marketplace-f7x42\" (UID: \"d26a045c-658e-4950-9d31-98fcc7405794\") " pod="openshift-marketplace/redhat-marketplace-f7x42" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.856910 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hdfj"] Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.898043 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb9qj\" (UniqueName: \"kubernetes.io/projected/d26a045c-658e-4950-9d31-98fcc7405794-kube-api-access-rb9qj\") pod \"redhat-marketplace-f7x42\" (UID: \"d26a045c-658e-4950-9d31-98fcc7405794\") " pod="openshift-marketplace/redhat-marketplace-f7x42" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.958557 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee931362-9b98-4d81-b928-7f9bc9810dea-catalog-content\") pod \"redhat-operators-8hdfj\" (UID: \"ee931362-9b98-4d81-b928-7f9bc9810dea\") " pod="openshift-marketplace/redhat-operators-8hdfj" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.958610 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9bfc\" (UniqueName: \"kubernetes.io/projected/ee931362-9b98-4d81-b928-7f9bc9810dea-kube-api-access-g9bfc\") pod \"redhat-operators-8hdfj\" (UID: 
\"ee931362-9b98-4d81-b928-7f9bc9810dea\") " pod="openshift-marketplace/redhat-operators-8hdfj" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.958781 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee931362-9b98-4d81-b928-7f9bc9810dea-utilities\") pod \"redhat-operators-8hdfj\" (UID: \"ee931362-9b98-4d81-b928-7f9bc9810dea\") " pod="openshift-marketplace/redhat-operators-8hdfj" Dec 02 07:51:11 crc kubenswrapper[4691]: I1202 07:51:11.966461 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-f7x42" Dec 02 07:51:12 crc kubenswrapper[4691]: I1202 07:51:12.060717 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9bfc\" (UniqueName: \"kubernetes.io/projected/ee931362-9b98-4d81-b928-7f9bc9810dea-kube-api-access-g9bfc\") pod \"redhat-operators-8hdfj\" (UID: \"ee931362-9b98-4d81-b928-7f9bc9810dea\") " pod="openshift-marketplace/redhat-operators-8hdfj" Dec 02 07:51:12 crc kubenswrapper[4691]: I1202 07:51:12.061121 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee931362-9b98-4d81-b928-7f9bc9810dea-utilities\") pod \"redhat-operators-8hdfj\" (UID: \"ee931362-9b98-4d81-b928-7f9bc9810dea\") " pod="openshift-marketplace/redhat-operators-8hdfj" Dec 02 07:51:12 crc kubenswrapper[4691]: I1202 07:51:12.061233 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee931362-9b98-4d81-b928-7f9bc9810dea-catalog-content\") pod \"redhat-operators-8hdfj\" (UID: \"ee931362-9b98-4d81-b928-7f9bc9810dea\") " pod="openshift-marketplace/redhat-operators-8hdfj" Dec 02 07:51:12 crc kubenswrapper[4691]: I1202 07:51:12.061548 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee931362-9b98-4d81-b928-7f9bc9810dea-utilities\") pod \"redhat-operators-8hdfj\" (UID: \"ee931362-9b98-4d81-b928-7f9bc9810dea\") " pod="openshift-marketplace/redhat-operators-8hdfj" Dec 02 07:51:12 crc kubenswrapper[4691]: I1202 07:51:12.062942 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee931362-9b98-4d81-b928-7f9bc9810dea-catalog-content\") pod \"redhat-operators-8hdfj\" (UID: \"ee931362-9b98-4d81-b928-7f9bc9810dea\") " pod="openshift-marketplace/redhat-operators-8hdfj" Dec 02 07:51:12 crc kubenswrapper[4691]: I1202 07:51:12.088018 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9bfc\" (UniqueName: \"kubernetes.io/projected/ee931362-9b98-4d81-b928-7f9bc9810dea-kube-api-access-g9bfc\") pod \"redhat-operators-8hdfj\" (UID: \"ee931362-9b98-4d81-b928-7f9bc9810dea\") " pod="openshift-marketplace/redhat-operators-8hdfj" Dec 02 07:51:12 crc kubenswrapper[4691]: I1202 07:51:12.194239 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8hdfj" Dec 02 07:51:12 crc kubenswrapper[4691]: I1202 07:51:12.367933 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-f7x42"] Dec 02 07:51:12 crc kubenswrapper[4691]: W1202 07:51:12.370834 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd26a045c_658e_4950_9d31_98fcc7405794.slice/crio-fc162b7dc2ff3502554479c8f2d35913077d3e1cfce8f02716a9717f56c23500 WatchSource:0}: Error finding container fc162b7dc2ff3502554479c8f2d35913077d3e1cfce8f02716a9717f56c23500: Status 404 returned error can't find the container with id fc162b7dc2ff3502554479c8f2d35913077d3e1cfce8f02716a9717f56c23500 Dec 02 07:51:12 crc kubenswrapper[4691]: I1202 07:51:12.586467 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8hdfj"] Dec 02 07:51:12 crc kubenswrapper[4691]: W1202 07:51:12.596152 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee931362_9b98_4d81_b928_7f9bc9810dea.slice/crio-d7df10f8ce4ce4f864d2767361684b13d976123e7e6e265fd0d2efb08b4692e2 WatchSource:0}: Error finding container d7df10f8ce4ce4f864d2767361684b13d976123e7e6e265fd0d2efb08b4692e2: Status 404 returned error can't find the container with id d7df10f8ce4ce4f864d2767361684b13d976123e7e6e265fd0d2efb08b4692e2 Dec 02 07:51:12 crc kubenswrapper[4691]: I1202 07:51:12.734626 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hdfj" event={"ID":"ee931362-9b98-4d81-b928-7f9bc9810dea","Type":"ContainerStarted","Data":"d7df10f8ce4ce4f864d2767361684b13d976123e7e6e265fd0d2efb08b4692e2"} Dec 02 07:51:12 crc kubenswrapper[4691]: I1202 07:51:12.736787 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f7x42" event={"ID":"d26a045c-658e-4950-9d31-98fcc7405794","Type":"ContainerStarted","Data":"5e526190fdb832ace384d7702cf155219e79b99616023d9089673ed5ba8e29f4"} Dec 02 07:51:12 crc kubenswrapper[4691]: I1202 07:51:12.736872 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f7x42" event={"ID":"d26a045c-658e-4950-9d31-98fcc7405794","Type":"ContainerStarted","Data":"fc162b7dc2ff3502554479c8f2d35913077d3e1cfce8f02716a9717f56c23500"} Dec 02 07:51:13 crc kubenswrapper[4691]: I1202 07:51:13.743700 4691 generic.go:334] "Generic (PLEG): container finished" podID="ee931362-9b98-4d81-b928-7f9bc9810dea" containerID="4982636ad5cf80b4c759461401716a7e668c7c363519d7a38ad0f808bf09cf0e" exitCode=0 Dec 02 07:51:13 crc kubenswrapper[4691]: I1202 07:51:13.743812 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hdfj" event={"ID":"ee931362-9b98-4d81-b928-7f9bc9810dea","Type":"ContainerDied","Data":"4982636ad5cf80b4c759461401716a7e668c7c363519d7a38ad0f808bf09cf0e"} Dec 02 07:51:13 crc kubenswrapper[4691]: I1202 07:51:13.749443 4691 generic.go:334] "Generic (PLEG): container finished" podID="d26a045c-658e-4950-9d31-98fcc7405794" containerID="5e526190fdb832ace384d7702cf155219e79b99616023d9089673ed5ba8e29f4" exitCode=0 Dec 02 07:51:13 crc kubenswrapper[4691]: I1202 07:51:13.749487 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f7x42" 
event={"ID":"d26a045c-658e-4950-9d31-98fcc7405794","Type":"ContainerDied","Data":"5e526190fdb832ace384d7702cf155219e79b99616023d9089673ed5ba8e29f4"} Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.041624 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k929f"] Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.042719 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k929f" Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.044729 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.054905 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k929f"] Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.197473 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62dqv\" (UniqueName: \"kubernetes.io/projected/c31c7ca5-195b-41cd-9dee-849169e0fc79-kube-api-access-62dqv\") pod \"community-operators-k929f\" (UID: \"c31c7ca5-195b-41cd-9dee-849169e0fc79\") " pod="openshift-marketplace/community-operators-k929f" Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.197554 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c31c7ca5-195b-41cd-9dee-849169e0fc79-utilities\") pod \"community-operators-k929f\" (UID: \"c31c7ca5-195b-41cd-9dee-849169e0fc79\") " pod="openshift-marketplace/community-operators-k929f" Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.197575 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c31c7ca5-195b-41cd-9dee-849169e0fc79-catalog-content\") pod \"community-operators-k929f\" (UID: \"c31c7ca5-195b-41cd-9dee-849169e0fc79\") " pod="openshift-marketplace/community-operators-k929f" Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.243038 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g6czb"] Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.244379 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.250197 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.259877 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g6czb"]
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.299400 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62dqv\" (UniqueName: \"kubernetes.io/projected/c31c7ca5-195b-41cd-9dee-849169e0fc79-kube-api-access-62dqv\") pod \"community-operators-k929f\" (UID: \"c31c7ca5-195b-41cd-9dee-849169e0fc79\") " pod="openshift-marketplace/community-operators-k929f"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.299486 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c31c7ca5-195b-41cd-9dee-849169e0fc79-utilities\") pod \"community-operators-k929f\" (UID: \"c31c7ca5-195b-41cd-9dee-849169e0fc79\") " pod="openshift-marketplace/community-operators-k929f"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.299544 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c31c7ca5-195b-41cd-9dee-849169e0fc79-catalog-content\") pod \"community-operators-k929f\" (UID: \"c31c7ca5-195b-41cd-9dee-849169e0fc79\") " pod="openshift-marketplace/community-operators-k929f"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.300191 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c31c7ca5-195b-41cd-9dee-849169e0fc79-utilities\") pod \"community-operators-k929f\" (UID: \"c31c7ca5-195b-41cd-9dee-849169e0fc79\") " pod="openshift-marketplace/community-operators-k929f"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.300215 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c31c7ca5-195b-41cd-9dee-849169e0fc79-catalog-content\") pod \"community-operators-k929f\" (UID: \"c31c7ca5-195b-41cd-9dee-849169e0fc79\") " pod="openshift-marketplace/community-operators-k929f"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.321936 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62dqv\" (UniqueName: \"kubernetes.io/projected/c31c7ca5-195b-41cd-9dee-849169e0fc79-kube-api-access-62dqv\") pod \"community-operators-k929f\" (UID: \"c31c7ca5-195b-41cd-9dee-849169e0fc79\") " pod="openshift-marketplace/community-operators-k929f"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.394623 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k929f"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.400686 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96x7n\" (UniqueName: \"kubernetes.io/projected/18fe1559-e4cf-4738-ba81-28146b21a37a-kube-api-access-96x7n\") pod \"certified-operators-g6czb\" (UID: \"18fe1559-e4cf-4738-ba81-28146b21a37a\") " pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.400740 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18fe1559-e4cf-4738-ba81-28146b21a37a-utilities\") pod \"certified-operators-g6czb\" (UID: \"18fe1559-e4cf-4738-ba81-28146b21a37a\") " pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.400790 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18fe1559-e4cf-4738-ba81-28146b21a37a-catalog-content\") pod \"certified-operators-g6czb\" (UID: \"18fe1559-e4cf-4738-ba81-28146b21a37a\") " pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.505727 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18fe1559-e4cf-4738-ba81-28146b21a37a-utilities\") pod \"certified-operators-g6czb\" (UID: \"18fe1559-e4cf-4738-ba81-28146b21a37a\") " pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.506108 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18fe1559-e4cf-4738-ba81-28146b21a37a-catalog-content\") pod \"certified-operators-g6czb\" (UID: \"18fe1559-e4cf-4738-ba81-28146b21a37a\") " pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.506165 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/18fe1559-e4cf-4738-ba81-28146b21a37a-utilities\") pod \"certified-operators-g6czb\" (UID: \"18fe1559-e4cf-4738-ba81-28146b21a37a\") " pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.506203 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96x7n\" (UniqueName: \"kubernetes.io/projected/18fe1559-e4cf-4738-ba81-28146b21a37a-kube-api-access-96x7n\") pod \"certified-operators-g6czb\" (UID: \"18fe1559-e4cf-4738-ba81-28146b21a37a\") " pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.506476 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/18fe1559-e4cf-4738-ba81-28146b21a37a-catalog-content\") pod \"certified-operators-g6czb\" (UID: \"18fe1559-e4cf-4738-ba81-28146b21a37a\") " pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.535066 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96x7n\" (UniqueName: \"kubernetes.io/projected/18fe1559-e4cf-4738-ba81-28146b21a37a-kube-api-access-96x7n\") pod \"certified-operators-g6czb\" (UID: \"18fe1559-e4cf-4738-ba81-28146b21a37a\") " pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.564995 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.760156 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hdfj" event={"ID":"ee931362-9b98-4d81-b928-7f9bc9810dea","Type":"ContainerStarted","Data":"ff2a0b00aa3fb0046f0577046c38ce5db1dac7ee914f5b0214e070ed033c0747"}
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.761010 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g6czb"]
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.763508 4691 generic.go:334] "Generic (PLEG): container finished" podID="d26a045c-658e-4950-9d31-98fcc7405794" containerID="efe488730722915fbf748d77594bb4e2a468f5987ca7f2e6ab0f6c71448a9e88" exitCode=0
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.763540 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f7x42" event={"ID":"d26a045c-658e-4950-9d31-98fcc7405794","Type":"ContainerDied","Data":"efe488730722915fbf748d77594bb4e2a468f5987ca7f2e6ab0f6c71448a9e88"}
Dec 02 07:51:14 crc kubenswrapper[4691]: W1202 07:51:14.788835 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18fe1559_e4cf_4738_ba81_28146b21a37a.slice/crio-ebb1c99d1a938e7ecc34de1f3082cfc55c52a4eb57117412665f6a8a98efc509 WatchSource:0}: Error finding container ebb1c99d1a938e7ecc34de1f3082cfc55c52a4eb57117412665f6a8a98efc509: Status 404 returned error can't find the container with id ebb1c99d1a938e7ecc34de1f3082cfc55c52a4eb57117412665f6a8a98efc509
Dec 02 07:51:14 crc kubenswrapper[4691]: I1202 07:51:14.839025 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k929f"]
Dec 02 07:51:14 crc kubenswrapper[4691]: W1202 07:51:14.844203 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc31c7ca5_195b_41cd_9dee_849169e0fc79.slice/crio-b02ef715de44c39b43759c1ae9d20c0aedba55a06da9daf68bd266ea903fab3c WatchSource:0}: Error finding container b02ef715de44c39b43759c1ae9d20c0aedba55a06da9daf68bd266ea903fab3c: Status 404 returned error can't find the container with id b02ef715de44c39b43759c1ae9d20c0aedba55a06da9daf68bd266ea903fab3c
Dec 02 07:51:15 crc kubenswrapper[4691]: I1202 07:51:15.783689 4691 generic.go:334] "Generic (PLEG): container finished" podID="ee931362-9b98-4d81-b928-7f9bc9810dea" containerID="ff2a0b00aa3fb0046f0577046c38ce5db1dac7ee914f5b0214e070ed033c0747" exitCode=0
Dec 02 07:51:15 crc kubenswrapper[4691]: I1202 07:51:15.784147 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hdfj" event={"ID":"ee931362-9b98-4d81-b928-7f9bc9810dea","Type":"ContainerDied","Data":"ff2a0b00aa3fb0046f0577046c38ce5db1dac7ee914f5b0214e070ed033c0747"}
Dec 02 07:51:15 crc kubenswrapper[4691]: I1202 07:51:15.787732 4691 generic.go:334] "Generic (PLEG): container finished" podID="c31c7ca5-195b-41cd-9dee-849169e0fc79" containerID="3753756c1a84b834062472a964922e22261b1a397322601e8aeedaf147ef7a51" exitCode=0
Dec 02 07:51:15 crc kubenswrapper[4691]: I1202 07:51:15.787808 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k929f" event={"ID":"c31c7ca5-195b-41cd-9dee-849169e0fc79","Type":"ContainerDied","Data":"3753756c1a84b834062472a964922e22261b1a397322601e8aeedaf147ef7a51"}
Dec 02 07:51:15 crc kubenswrapper[4691]: I1202 07:51:15.787831 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k929f" event={"ID":"c31c7ca5-195b-41cd-9dee-849169e0fc79","Type":"ContainerStarted","Data":"b02ef715de44c39b43759c1ae9d20c0aedba55a06da9daf68bd266ea903fab3c"}
Dec 02 07:51:15 crc kubenswrapper[4691]: I1202 07:51:15.790120 4691 generic.go:334] "Generic (PLEG): container finished" podID="18fe1559-e4cf-4738-ba81-28146b21a37a" containerID="c9a6bd4cf2a80024ff2e0aca634625ab160bd10569127105d0d9069fc5125b41" exitCode=0
Dec 02 07:51:15 crc kubenswrapper[4691]: I1202 07:51:15.790151 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6czb" event={"ID":"18fe1559-e4cf-4738-ba81-28146b21a37a","Type":"ContainerDied","Data":"c9a6bd4cf2a80024ff2e0aca634625ab160bd10569127105d0d9069fc5125b41"}
Dec 02 07:51:15 crc kubenswrapper[4691]: I1202 07:51:15.790171 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6czb" event={"ID":"18fe1559-e4cf-4738-ba81-28146b21a37a","Type":"ContainerStarted","Data":"ebb1c99d1a938e7ecc34de1f3082cfc55c52a4eb57117412665f6a8a98efc509"}
Dec 02 07:51:16 crc kubenswrapper[4691]: I1202 07:51:16.798438 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-f7x42" event={"ID":"d26a045c-658e-4950-9d31-98fcc7405794","Type":"ContainerStarted","Data":"19ba9a4aa5576600b8eb59e706fa81cd82481999f2086e74f24896765e8216f7"}
Dec 02 07:51:16 crc kubenswrapper[4691]: I1202 07:51:16.817357 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-f7x42" podStartSLOduration=3.753706684 podStartE2EDuration="5.817339593s" podCreationTimestamp="2025-12-02 07:51:11 +0000 UTC" firstStartedPulling="2025-12-02 07:51:13.750843124 +0000 UTC m=+321.534921986" lastFinishedPulling="2025-12-02 07:51:15.814476033 +0000 UTC m=+323.598554895" observedRunningTime="2025-12-02 07:51:16.817115758 +0000 UTC m=+324.601194640" watchObservedRunningTime="2025-12-02 07:51:16.817339593 +0000 UTC m=+324.601418455"
Dec 02 07:51:17 crc kubenswrapper[4691]: I1202 07:51:17.807198 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8hdfj" event={"ID":"ee931362-9b98-4d81-b928-7f9bc9810dea","Type":"ContainerStarted","Data":"7662fa894b7fdfd9dce9749d8cffd7fa9f3eed702689929a8527aa622851e3c1"}
Dec 02 07:51:17 crc kubenswrapper[4691]: I1202 07:51:17.809384 4691 generic.go:334] "Generic (PLEG): container finished" podID="c31c7ca5-195b-41cd-9dee-849169e0fc79" containerID="cd886934aff233bc1a3c8a7d9da6b0d342879c686d910e5bdac3485f50ed1d7f" exitCode=0
Dec 02 07:51:17 crc kubenswrapper[4691]: I1202 07:51:17.809459 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k929f" event={"ID":"c31c7ca5-195b-41cd-9dee-849169e0fc79","Type":"ContainerDied","Data":"cd886934aff233bc1a3c8a7d9da6b0d342879c686d910e5bdac3485f50ed1d7f"}
Dec 02 07:51:17 crc kubenswrapper[4691]: I1202 07:51:17.811191 4691 generic.go:334] "Generic (PLEG): container finished" podID="18fe1559-e4cf-4738-ba81-28146b21a37a" containerID="4bdd0afab95b5ea0aeea7ec53ec0daf7ab9b811a08edb6809437e5ce50fd7717" exitCode=0
Dec 02 07:51:17 crc kubenswrapper[4691]: I1202 07:51:17.811275 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6czb" event={"ID":"18fe1559-e4cf-4738-ba81-28146b21a37a","Type":"ContainerDied","Data":"4bdd0afab95b5ea0aeea7ec53ec0daf7ab9b811a08edb6809437e5ce50fd7717"}
Dec 02 07:51:17 crc kubenswrapper[4691]: I1202 07:51:17.827476 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8hdfj" podStartSLOduration=3.833967006 podStartE2EDuration="6.827459443s" podCreationTimestamp="2025-12-02 07:51:11 +0000 UTC" firstStartedPulling="2025-12-02 07:51:13.746334713 +0000 UTC m=+321.530413575" lastFinishedPulling="2025-12-02 07:51:16.73982714 +0000 UTC m=+324.523906012" observedRunningTime="2025-12-02 07:51:17.825405902 +0000 UTC m=+325.609484764" watchObservedRunningTime="2025-12-02 07:51:17.827459443 +0000 UTC m=+325.611538305"
Dec 02 07:51:18 crc kubenswrapper[4691]: I1202 07:51:18.819968 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k929f" event={"ID":"c31c7ca5-195b-41cd-9dee-849169e0fc79","Type":"ContainerStarted","Data":"0edaae21fc13d2f2247e424958fc12ebd5ab4fcbd0028c291fa9db54b022abbc"}
Dec 02 07:51:18 crc kubenswrapper[4691]: I1202 07:51:18.836560 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k929f" podStartSLOduration=2.144280698 podStartE2EDuration="4.836538209s" podCreationTimestamp="2025-12-02 07:51:14 +0000 UTC" firstStartedPulling="2025-12-02 07:51:15.807576841 +0000 UTC m=+323.591655703" lastFinishedPulling="2025-12-02 07:51:18.499834342 +0000 UTC m=+326.283913214" observedRunningTime="2025-12-02 07:51:18.834382335 +0000 UTC m=+326.618461197" watchObservedRunningTime="2025-12-02 07:51:18.836538209 +0000 UTC m=+326.620617081"
Dec 02 07:51:19 crc kubenswrapper[4691]: I1202 07:51:19.827148 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g6czb" event={"ID":"18fe1559-e4cf-4738-ba81-28146b21a37a","Type":"ContainerStarted","Data":"5f818a9bbd5167b7c13e34fa6c7256158e6923852976787d983065fdb6bc0674"}
Dec 02 07:51:19 crc kubenswrapper[4691]: I1202 07:51:19.845918 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g6czb" podStartSLOduration=2.657070886 podStartE2EDuration="5.845900821s" podCreationTimestamp="2025-12-02 07:51:14 +0000 UTC" firstStartedPulling="2025-12-02 07:51:15.808191447 +0000 UTC m=+323.592270309" lastFinishedPulling="2025-12-02 07:51:18.997021382 +0000 UTC m=+326.781100244" observedRunningTime="2025-12-02 07:51:19.841085893 +0000 UTC m=+327.625164765" watchObservedRunningTime="2025-12-02 07:51:19.845900821 +0000 UTC m=+327.629979683"
Dec 02 07:51:21 crc kubenswrapper[4691]: I1202 07:51:21.966992 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-f7x42"
Dec 02 07:51:21 crc kubenswrapper[4691]: I1202 07:51:21.967051 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-f7x42"
Dec 02 07:51:22 crc kubenswrapper[4691]: I1202 07:51:22.004238 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-f7x42"
Dec 02 07:51:22 crc kubenswrapper[4691]: I1202 07:51:22.195447 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8hdfj"
Dec 02 07:51:22 crc kubenswrapper[4691]: I1202 07:51:22.195493 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8hdfj"
Dec 02 07:51:22 crc kubenswrapper[4691]: I1202 07:51:22.232644 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8hdfj"
Dec 02 07:51:22 crc kubenswrapper[4691]: I1202 07:51:22.882210 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8hdfj"
Dec 02 07:51:22 crc kubenswrapper[4691]: I1202 07:51:22.887098 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-f7x42"
Dec 02 07:51:23 crc kubenswrapper[4691]: I1202 07:51:23.295673 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw"]
Dec 02 07:51:23 crc kubenswrapper[4691]: I1202 07:51:23.295906 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" podUID="9b5bbd65-e255-4795-91b8-2bbbaf130f24" containerName="route-controller-manager" containerID="cri-o://5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30" gracePeriod=30
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.395661 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k929f"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.395724 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k929f"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.442681 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k929f"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.569488 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.569866 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.607963 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.827128 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.851775 4691 generic.go:334] "Generic (PLEG): container finished" podID="9b5bbd65-e255-4795-91b8-2bbbaf130f24" containerID="5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30" exitCode=0
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.851870 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.851914 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" event={"ID":"9b5bbd65-e255-4795-91b8-2bbbaf130f24","Type":"ContainerDied","Data":"5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30"}
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.851950 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw" event={"ID":"9b5bbd65-e255-4795-91b8-2bbbaf130f24","Type":"ContainerDied","Data":"fb207815d8a290ab16b76823924eca1f6acc8280bff28ee0cdcf9e5893d0878f"}
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.851966 4691 scope.go:117] "RemoveContainer" containerID="5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.868651 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"]
Dec 02 07:51:24 crc kubenswrapper[4691]: E1202 07:51:24.868960 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b5bbd65-e255-4795-91b8-2bbbaf130f24" containerName="route-controller-manager"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.868984 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b5bbd65-e255-4795-91b8-2bbbaf130f24" containerName="route-controller-manager"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.869096 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b5bbd65-e255-4795-91b8-2bbbaf130f24" containerName="route-controller-manager"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.869596 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.874926 4691 scope.go:117] "RemoveContainer" containerID="5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30"
Dec 02 07:51:24 crc kubenswrapper[4691]: E1202 07:51:24.875598 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30\": container with ID starting with 5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30 not found: ID does not exist" containerID="5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.875630 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30"} err="failed to get container status \"5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30\": rpc error: code = NotFound desc = could not find container \"5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30\": container with ID starting with 5ba6b3e47b348ede8492da2864614ece4e8f5999c8c2b9c45a26f44731827a30 not found: ID does not exist"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.885176 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"]
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.897989 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k929f"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.902127 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g6czb"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.945615 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zch5d\" (UniqueName: \"kubernetes.io/projected/9b5bbd65-e255-4795-91b8-2bbbaf130f24-kube-api-access-zch5d\") pod \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") "
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.945697 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b5bbd65-e255-4795-91b8-2bbbaf130f24-client-ca\") pod \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") "
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.945892 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b5bbd65-e255-4795-91b8-2bbbaf130f24-serving-cert\") pod \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") "
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.945983 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b5bbd65-e255-4795-91b8-2bbbaf130f24-config\") pod \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\" (UID: \"9b5bbd65-e255-4795-91b8-2bbbaf130f24\") "
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.946250 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1379450d-6a82-48e7-83ed-82676d8c8174-serving-cert\") pod \"route-controller-manager-f5fd89ccc-cs2nm\" (UID: \"1379450d-6a82-48e7-83ed-82676d8c8174\") " pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.946404 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flcng\" (UniqueName: \"kubernetes.io/projected/1379450d-6a82-48e7-83ed-82676d8c8174-kube-api-access-flcng\") pod \"route-controller-manager-f5fd89ccc-cs2nm\" (UID: \"1379450d-6a82-48e7-83ed-82676d8c8174\") " pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.946474 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1379450d-6a82-48e7-83ed-82676d8c8174-client-ca\") pod \"route-controller-manager-f5fd89ccc-cs2nm\" (UID: \"1379450d-6a82-48e7-83ed-82676d8c8174\") " pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.946524 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1379450d-6a82-48e7-83ed-82676d8c8174-config\") pod \"route-controller-manager-f5fd89ccc-cs2nm\" (UID: \"1379450d-6a82-48e7-83ed-82676d8c8174\") " pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.946708 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b5bbd65-e255-4795-91b8-2bbbaf130f24-client-ca" (OuterVolumeSpecName: "client-ca") pod "9b5bbd65-e255-4795-91b8-2bbbaf130f24" (UID: "9b5bbd65-e255-4795-91b8-2bbbaf130f24"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.948715 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b5bbd65-e255-4795-91b8-2bbbaf130f24-config" (OuterVolumeSpecName: "config") pod "9b5bbd65-e255-4795-91b8-2bbbaf130f24" (UID: "9b5bbd65-e255-4795-91b8-2bbbaf130f24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.951149 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b5bbd65-e255-4795-91b8-2bbbaf130f24-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9b5bbd65-e255-4795-91b8-2bbbaf130f24" (UID: "9b5bbd65-e255-4795-91b8-2bbbaf130f24"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:51:24 crc kubenswrapper[4691]: I1202 07:51:24.967651 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b5bbd65-e255-4795-91b8-2bbbaf130f24-kube-api-access-zch5d" (OuterVolumeSpecName: "kube-api-access-zch5d") pod "9b5bbd65-e255-4795-91b8-2bbbaf130f24" (UID: "9b5bbd65-e255-4795-91b8-2bbbaf130f24"). InnerVolumeSpecName "kube-api-access-zch5d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.047767 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flcng\" (UniqueName: \"kubernetes.io/projected/1379450d-6a82-48e7-83ed-82676d8c8174-kube-api-access-flcng\") pod \"route-controller-manager-f5fd89ccc-cs2nm\" (UID: \"1379450d-6a82-48e7-83ed-82676d8c8174\") " pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.047839 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1379450d-6a82-48e7-83ed-82676d8c8174-client-ca\") pod \"route-controller-manager-f5fd89ccc-cs2nm\" (UID: \"1379450d-6a82-48e7-83ed-82676d8c8174\") " pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.047892 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1379450d-6a82-48e7-83ed-82676d8c8174-config\") pod \"route-controller-manager-f5fd89ccc-cs2nm\" (UID: \"1379450d-6a82-48e7-83ed-82676d8c8174\") " pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.047928 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1379450d-6a82-48e7-83ed-82676d8c8174-serving-cert\") pod \"route-controller-manager-f5fd89ccc-cs2nm\" (UID: \"1379450d-6a82-48e7-83ed-82676d8c8174\") " pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.047968 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b5bbd65-e255-4795-91b8-2bbbaf130f24-config\") on node \"crc\" DevicePath \"\""
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.047984 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zch5d\" (UniqueName: \"kubernetes.io/projected/9b5bbd65-e255-4795-91b8-2bbbaf130f24-kube-api-access-zch5d\") on node \"crc\" DevicePath \"\""
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.048015 4691 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b5bbd65-e255-4795-91b8-2bbbaf130f24-client-ca\") on node \"crc\" DevicePath \"\""
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.048027 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b5bbd65-e255-4795-91b8-2bbbaf130f24-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.048997 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1379450d-6a82-48e7-83ed-82676d8c8174-client-ca\") pod \"route-controller-manager-f5fd89ccc-cs2nm\" (UID: \"1379450d-6a82-48e7-83ed-82676d8c8174\") " pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.049822 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1379450d-6a82-48e7-83ed-82676d8c8174-config\") pod \"route-controller-manager-f5fd89ccc-cs2nm\" (UID: \"1379450d-6a82-48e7-83ed-82676d8c8174\") " pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.051329 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1379450d-6a82-48e7-83ed-82676d8c8174-serving-cert\") pod \"route-controller-manager-f5fd89ccc-cs2nm\" (UID: \"1379450d-6a82-48e7-83ed-82676d8c8174\") " pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.062468 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flcng\" (UniqueName: \"kubernetes.io/projected/1379450d-6a82-48e7-83ed-82676d8c8174-kube-api-access-flcng\") pod \"route-controller-manager-f5fd89ccc-cs2nm\" (UID: \"1379450d-6a82-48e7-83ed-82676d8c8174\") " pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.181668 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw"]
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.186172 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f94579cc-gw5rw"]
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.193428 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.587685 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"]
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.858464 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm" event={"ID":"1379450d-6a82-48e7-83ed-82676d8c8174","Type":"ContainerStarted","Data":"58b2d3687fc4bb2068427d580cad5e414cbae31a0c116e9dd625bb6627e44406"}
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.858506 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm" event={"ID":"1379450d-6a82-48e7-83ed-82676d8c8174","Type":"ContainerStarted","Data":"eca2665d10517eeaa5f0d656212593d764910ab97e43ffe872a757eb3916a02d"}
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.861141 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:25 crc kubenswrapper[4691]: I1202 07:51:25.883511 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm" podStartSLOduration=2.883490941 podStartE2EDuration="2.883490941s" podCreationTimestamp="2025-12-02 07:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:51:25.879478133 +0000 UTC m=+333.663557005" watchObservedRunningTime="2025-12-02 07:51:25.883490941 +0000 UTC m=+333.667569813"
Dec 02 07:51:26 crc kubenswrapper[4691]: I1202 07:51:26.193853 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f5fd89ccc-cs2nm"
Dec 02 07:51:26 crc kubenswrapper[4691]: I1202 07:51:26.568620 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b5bbd65-e255-4795-91b8-2bbbaf130f24" path="/var/lib/kubelet/pods/9b5bbd65-e255-4795-91b8-2bbbaf130f24/volumes"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.531127 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f6pgf"]
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.533682 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.546029 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f6pgf"]
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.636677 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a963fc03-8869-40bd-a7f7-0b8b06f32468-registry-certificates\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.636720 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a963fc03-8869-40bd-a7f7-0b8b06f32468-registry-tls\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.636743 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a963fc03-8869-40bd-a7f7-0b8b06f32468-bound-sa-token\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.636764 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a963fc03-8869-40bd-a7f7-0b8b06f32468-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.636843 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9lf5\" (UniqueName: \"kubernetes.io/projected/a963fc03-8869-40bd-a7f7-0b8b06f32468-kube-api-access-f9lf5\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.636860 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a963fc03-8869-40bd-a7f7-0b8b06f32468-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.636918 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a963fc03-8869-40bd-a7f7-0b8b06f32468-trusted-ca\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.636954 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.663784 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.737827 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a963fc03-8869-40bd-a7f7-0b8b06f32468-trusted-ca\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.737911 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a963fc03-8869-40bd-a7f7-0b8b06f32468-registry-certificates\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.737943 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a963fc03-8869-40bd-a7f7-0b8b06f32468-registry-tls\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.737972 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a963fc03-8869-40bd-a7f7-0b8b06f32468-bound-sa-token\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.737997 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a963fc03-8869-40bd-a7f7-0b8b06f32468-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.738029 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9lf5\" (UniqueName: \"kubernetes.io/projected/a963fc03-8869-40bd-a7f7-0b8b06f32468-kube-api-access-f9lf5\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.738056 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a963fc03-8869-40bd-a7f7-0b8b06f32468-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.738943 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a963fc03-8869-40bd-a7f7-0b8b06f32468-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.739440 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a963fc03-8869-40bd-a7f7-0b8b06f32468-trusted-ca\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.739623 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a963fc03-8869-40bd-a7f7-0b8b06f32468-registry-certificates\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.743649 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a963fc03-8869-40bd-a7f7-0b8b06f32468-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.743747 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a963fc03-8869-40bd-a7f7-0b8b06f32468-registry-tls\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.754298 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a963fc03-8869-40bd-a7f7-0b8b06f32468-bound-sa-token\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.754744 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9lf5\" (UniqueName: \"kubernetes.io/projected/a963fc03-8869-40bd-a7f7-0b8b06f32468-kube-api-access-f9lf5\") pod \"image-registry-66df7c8f76-f6pgf\" (UID: \"a963fc03-8869-40bd-a7f7-0b8b06f32468\") " pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:48 crc kubenswrapper[4691]: I1202 07:51:48.853595 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:49 crc kubenswrapper[4691]: I1202 07:51:49.325285 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f6pgf"]
Dec 02 07:51:49 crc kubenswrapper[4691]: I1202 07:51:49.497599 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf" event={"ID":"a963fc03-8869-40bd-a7f7-0b8b06f32468","Type":"ContainerStarted","Data":"38601379564c04ae6d84a88408278ef01efceeb9442cf61a00b30fe08cdb27f0"}
Dec 02 07:51:49 crc kubenswrapper[4691]: I1202 07:51:49.497951 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:51:49 crc kubenswrapper[4691]: I1202 07:51:49.497969 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf" event={"ID":"a963fc03-8869-40bd-a7f7-0b8b06f32468","Type":"ContainerStarted","Data":"f2e47d495fabd111c533375d66f2f065446b6346dbc8b9181ca0b22c3b4533e4"}
Dec 02 07:51:49 crc kubenswrapper[4691]: I1202 07:51:49.518282 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf" podStartSLOduration=1.518245246 podStartE2EDuration="1.518245246s" podCreationTimestamp="2025-12-02 07:51:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:51:49.513576551 +0000 UTC m=+357.297655433" watchObservedRunningTime="2025-12-02 07:51:49.518245246 +0000 UTC m=+357.302324108"
Dec 02 07:51:51 crc kubenswrapper[4691]: I1202 07:51:51.898755 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 07:51:51 crc kubenswrapper[4691]: I1202 07:51:51.899124 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 07:52:08 crc kubenswrapper[4691]: I1202 07:52:08.860471 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-f6pgf"
Dec 02 07:52:08 crc kubenswrapper[4691]: I1202 07:52:08.917470 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kq8jr"]
Dec 02 07:52:21 crc kubenswrapper[4691]: I1202 07:52:21.899125 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 07:52:21 crc kubenswrapper[4691]: I1202 07:52:21.899707 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 07:52:33 crc kubenswrapper[4691]: I1202 07:52:33.974739 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" podUID="53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" containerName="registry" containerID="cri-o://bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c" gracePeriod=30
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.329772 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.394065 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfzrp\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-kube-api-access-pfzrp\") pod \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") "
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.394231 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") "
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.394277 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-installation-pull-secrets\") pod \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") "
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.394321 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-registry-certificates\") pod \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") "
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.394364 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-trusted-ca\") pod \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") "
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.394388 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-registry-tls\") pod \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") "
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.394404 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-bound-sa-token\") pod \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") "
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.394427 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-ca-trust-extracted\") pod \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\" (UID: \"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec\") "
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.395472 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.395828 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.403245 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.406944 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-kube-api-access-pfzrp" (OuterVolumeSpecName: "kube-api-access-pfzrp") pod "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec"). InnerVolumeSpecName "kube-api-access-pfzrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.407086 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.407419 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.407571 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.425256 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" (UID: "53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.495478 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-trusted-ca\") on node \"crc\" DevicePath \"\""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.495517 4691 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-registry-tls\") on node \"crc\" DevicePath \"\""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.495527 4691 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-bound-sa-token\") on node \"crc\" DevicePath \"\""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.495539 4691 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.495548 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfzrp\" (UniqueName: \"kubernetes.io/projected/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-kube-api-access-pfzrp\") on node \"crc\" DevicePath \"\""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.495557 4691 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.495565 4691 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec-registry-certificates\") on node \"crc\" DevicePath \"\""
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.744012 4691 generic.go:334] "Generic (PLEG): container finished" podID="53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" containerID="bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c" exitCode=0
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.744058 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" event={"ID":"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec","Type":"ContainerDied","Data":"bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c"}
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.744106 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr" event={"ID":"53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec","Type":"ContainerDied","Data":"c04ed303f172948be9c4b6f940c935baac2667029a854ad418190686c4a95b97"}
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.744130 4691 scope.go:117] "RemoveContainer" containerID="bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c"
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.744291 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kq8jr"
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.761019 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kq8jr"]
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.764878 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kq8jr"]
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.777654 4691 scope.go:117] "RemoveContainer" containerID="bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c"
Dec 02 07:52:34 crc kubenswrapper[4691]: E1202 07:52:34.778194 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c\": container with ID starting with bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c not found: ID does not exist" containerID="bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c"
Dec 02 07:52:34 crc kubenswrapper[4691]: I1202 07:52:34.778319 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c"} err="failed to get container status \"bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c\": rpc error: code = NotFound desc = could not find container \"bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c\": container with ID starting with bddae39f95fb64362c16b84a19b4c545c66b62022688cdb3fdb2704e24f7189c not found: ID does not exist"
Dec 02 07:52:36 crc kubenswrapper[4691]: I1202 07:52:36.574121 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" path="/var/lib/kubelet/pods/53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec/volumes"
Dec 02 07:52:51 crc kubenswrapper[4691]: I1202 07:52:51.899307 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 07:52:51 crc kubenswrapper[4691]: I1202 07:52:51.899937 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 07:52:51 crc kubenswrapper[4691]: I1202 07:52:51.899990 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6"
Dec 02 07:52:51 crc kubenswrapper[4691]: I1202 07:52:51.900694 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bb78041367ae6920b31cd251da35411d957791f1be4b05d33750afac0123755e"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 07:52:51 crc kubenswrapper[4691]: I1202 07:52:51.900857 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://bb78041367ae6920b31cd251da35411d957791f1be4b05d33750afac0123755e" gracePeriod=600
Dec 02 07:52:52 crc kubenswrapper[4691]: I1202 07:52:52.853115 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="bb78041367ae6920b31cd251da35411d957791f1be4b05d33750afac0123755e" exitCode=0
Dec 02 07:52:52 crc kubenswrapper[4691]: I1202 07:52:52.853220 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"bb78041367ae6920b31cd251da35411d957791f1be4b05d33750afac0123755e"}
Dec 02 07:52:52 crc kubenswrapper[4691]: I1202 07:52:52.853634 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"a16996ecf2c98a339a05624da8a98affe6240ab365ac144c76fc73906ae11b70"}
Dec 02 07:52:52 crc kubenswrapper[4691]: I1202 07:52:52.853662 4691 scope.go:117] "RemoveContainer" containerID="f04c4a878b87035f6c1c15ca995b43d46c2033792ef2d7ab13e141adb126a873"
Dec 02 07:55:21 crc kubenswrapper[4691]: I1202 07:55:21.899001 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 07:55:21 crc kubenswrapper[4691]: I1202 07:55:21.899957 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 07:55:51 crc kubenswrapper[4691]: I1202 07:55:51.899085 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 07:55:51 crc kubenswrapper[4691]: I1202 07:55:51.900175 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.229699 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-bjslb"]
Dec 02 07:56:05 crc kubenswrapper[4691]: E1202 07:56:05.231288 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" containerName="registry"
Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.231321 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" containerName="registry"
Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.231527 4691 memory_manager.go:354] "RemoveStaleState removing state"
podUID="53a4d8e7-4755-4a5a-9728-8d9d3b8f2cec" containerName="registry" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.235006 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-bjslb" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.236997 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.237045 4691 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-9p672" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.237974 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.251460 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-5gp8z"] Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.252643 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-5gp8z" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.254785 4691 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-8lb64" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.257381 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-bjslb"] Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.261199 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-6jm2z"] Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.262268 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-6jm2z" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.264135 4691 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-f6jdj" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.300892 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-5gp8z"] Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.303728 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-6jm2z"] Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.408473 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8mxm\" (UniqueName: \"kubernetes.io/projected/d29342d1-9924-4226-ae63-6e405a469f70-kube-api-access-x8mxm\") pod \"cert-manager-5b446d88c5-6jm2z\" (UID: \"d29342d1-9924-4226-ae63-6e405a469f70\") " pod="cert-manager/cert-manager-5b446d88c5-6jm2z" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.408521 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w7l8\" (UniqueName: \"kubernetes.io/projected/e8a4ac7a-1f01-4c68-b5a2-e2200f0c31de-kube-api-access-9w7l8\") pod \"cert-manager-cainjector-7f985d654d-bjslb\" (UID: \"e8a4ac7a-1f01-4c68-b5a2-e2200f0c31de\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-bjslb" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.408618 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwcsv\" (UniqueName: \"kubernetes.io/projected/ee012121-3006-49dc-9f6d-349cdbc940a1-kube-api-access-fwcsv\") pod \"cert-manager-webhook-5655c58dd6-5gp8z\" (UID: \"ee012121-3006-49dc-9f6d-349cdbc940a1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-5gp8z" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.510311 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwcsv\" (UniqueName: \"kubernetes.io/projected/ee012121-3006-49dc-9f6d-349cdbc940a1-kube-api-access-fwcsv\") pod \"cert-manager-webhook-5655c58dd6-5gp8z\" (UID: \"ee012121-3006-49dc-9f6d-349cdbc940a1\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-5gp8z" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.510369 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8mxm\" (UniqueName: \"kubernetes.io/projected/d29342d1-9924-4226-ae63-6e405a469f70-kube-api-access-x8mxm\") pod \"cert-manager-5b446d88c5-6jm2z\" (UID: \"d29342d1-9924-4226-ae63-6e405a469f70\") " pod="cert-manager/cert-manager-5b446d88c5-6jm2z" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.510419 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w7l8\" (UniqueName: \"kubernetes.io/projected/e8a4ac7a-1f01-4c68-b5a2-e2200f0c31de-kube-api-access-9w7l8\") pod \"cert-manager-cainjector-7f985d654d-bjslb\" (UID: \"e8a4ac7a-1f01-4c68-b5a2-e2200f0c31de\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-bjslb" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.531145 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwcsv\" (UniqueName: \"kubernetes.io/projected/ee012121-3006-49dc-9f6d-349cdbc940a1-kube-api-access-fwcsv\") pod \"cert-manager-webhook-5655c58dd6-5gp8z\" (UID: \"ee012121-3006-49dc-9f6d-349cdbc940a1\") " 
pod="cert-manager/cert-manager-webhook-5655c58dd6-5gp8z" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.535223 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8mxm\" (UniqueName: \"kubernetes.io/projected/d29342d1-9924-4226-ae63-6e405a469f70-kube-api-access-x8mxm\") pod \"cert-manager-5b446d88c5-6jm2z\" (UID: \"d29342d1-9924-4226-ae63-6e405a469f70\") " pod="cert-manager/cert-manager-5b446d88c5-6jm2z" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.536690 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w7l8\" (UniqueName: \"kubernetes.io/projected/e8a4ac7a-1f01-4c68-b5a2-e2200f0c31de-kube-api-access-9w7l8\") pod \"cert-manager-cainjector-7f985d654d-bjslb\" (UID: \"e8a4ac7a-1f01-4c68-b5a2-e2200f0c31de\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-bjslb" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.553071 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-bjslb" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.612421 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-5gp8z" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.619826 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-6jm2z" Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.766065 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-bjslb"] Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.788764 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.837873 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-5gp8z"] Dec 02 07:56:05 crc kubenswrapper[4691]: I1202 07:56:05.885571 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-6jm2z"] Dec 02 07:56:05 crc kubenswrapper[4691]: W1202 07:56:05.887182 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd29342d1_9924_4226_ae63_6e405a469f70.slice/crio-bd46f7bb49ba1ba6a099c6d283249487a0f7f68d383a1cb960adb226081cc927 WatchSource:0}: Error finding container bd46f7bb49ba1ba6a099c6d283249487a0f7f68d383a1cb960adb226081cc927: Status 404 returned error can't find the container with id bd46f7bb49ba1ba6a099c6d283249487a0f7f68d383a1cb960adb226081cc927 Dec 02 07:56:06 crc kubenswrapper[4691]: I1202 07:56:06.114729 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-6jm2z" event={"ID":"d29342d1-9924-4226-ae63-6e405a469f70","Type":"ContainerStarted","Data":"bd46f7bb49ba1ba6a099c6d283249487a0f7f68d383a1cb960adb226081cc927"} Dec 02 07:56:06 crc kubenswrapper[4691]: I1202 07:56:06.115821 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-bjslb" event={"ID":"e8a4ac7a-1f01-4c68-b5a2-e2200f0c31de","Type":"ContainerStarted","Data":"2e9b7046cf0bb518ac85bb69d63af23cbf02fa0c33b9aee5e699992c845812eb"} Dec 02 07:56:06 crc kubenswrapper[4691]: I1202 07:56:06.116873 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-5gp8z" 
event={"ID":"ee012121-3006-49dc-9f6d-349cdbc940a1","Type":"ContainerStarted","Data":"dffdcbd93bc7e23fd2acd2696fde400b3e1a745c66ba60d30052f26242ca549a"} Dec 02 07:56:10 crc kubenswrapper[4691]: I1202 07:56:10.147172 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-bjslb" event={"ID":"e8a4ac7a-1f01-4c68-b5a2-e2200f0c31de","Type":"ContainerStarted","Data":"9bd5b4d18fd914b9fed5604a7a9af49e204610fb12e729617f56ecefcab69f77"} Dec 02 07:56:10 crc kubenswrapper[4691]: I1202 07:56:10.149024 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-5gp8z" event={"ID":"ee012121-3006-49dc-9f6d-349cdbc940a1","Type":"ContainerStarted","Data":"fa94647c41a0f7aadfda173d54d3ae5415149ce9430b2ca3fa7fa8384afd8c83"} Dec 02 07:56:10 crc kubenswrapper[4691]: I1202 07:56:10.149089 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-5gp8z" Dec 02 07:56:10 crc kubenswrapper[4691]: I1202 07:56:10.151622 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-6jm2z" event={"ID":"d29342d1-9924-4226-ae63-6e405a469f70","Type":"ContainerStarted","Data":"e3d196f4b9ce6846f46fa48a936e41e76a0fb565f994fc86eb0b9d834f17a4a6"} Dec 02 07:56:10 crc kubenswrapper[4691]: I1202 07:56:10.168187 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-bjslb" podStartSLOduration=1.998364641 podStartE2EDuration="5.168161063s" podCreationTimestamp="2025-12-02 07:56:05 +0000 UTC" firstStartedPulling="2025-12-02 07:56:05.78855872 +0000 UTC m=+613.572637582" lastFinishedPulling="2025-12-02 07:56:08.958355142 +0000 UTC m=+616.742434004" observedRunningTime="2025-12-02 07:56:10.163135645 +0000 UTC m=+617.947214507" watchObservedRunningTime="2025-12-02 07:56:10.168161063 +0000 UTC m=+617.952239925" Dec 02 07:56:10 crc kubenswrapper[4691]: I1202 07:56:10.195446 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-6jm2z" podStartSLOduration=2.126355598 podStartE2EDuration="5.195428197s" podCreationTimestamp="2025-12-02 07:56:05 +0000 UTC" firstStartedPulling="2025-12-02 07:56:05.889653153 +0000 UTC m=+613.673732005" lastFinishedPulling="2025-12-02 07:56:08.958725742 +0000 UTC m=+616.742804604" observedRunningTime="2025-12-02 07:56:10.194099463 +0000 UTC m=+617.978178325" watchObservedRunningTime="2025-12-02 07:56:10.195428197 +0000 UTC m=+617.979507059" Dec 02 07:56:10 crc kubenswrapper[4691]: I1202 07:56:10.213890 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-5gp8z" podStartSLOduration=2.027793549 podStartE2EDuration="5.213874576s" podCreationTimestamp="2025-12-02 07:56:05 +0000 UTC" firstStartedPulling="2025-12-02 07:56:05.844605556 +0000 UTC m=+613.628684418" lastFinishedPulling="2025-12-02 07:56:09.030686583 +0000 UTC m=+616.814765445" observedRunningTime="2025-12-02 07:56:10.211177388 +0000 UTC m=+617.995256250" watchObservedRunningTime="2025-12-02 07:56:10.213874576 +0000 UTC m=+617.997953438" Dec 02 07:56:15 crc kubenswrapper[4691]: I1202 07:56:15.615685 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-5gp8z" Dec 02 07:56:15 crc kubenswrapper[4691]: I1202 07:56:15.931989 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-7pgxh"] Dec 02 07:56:15 crc kubenswrapper[4691]: I1202 07:56:15.932680 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="nbdb" containerID="cri-o://46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c" gracePeriod=30 Dec 02 07:56:15 crc kubenswrapper[4691]: I1202 07:56:15.932726 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="kube-rbac-proxy-node" containerID="cri-o://64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340" gracePeriod=30 Dec 02 07:56:15 crc kubenswrapper[4691]: I1202 07:56:15.932815 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovn-acl-logging" containerID="cri-o://19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a" gracePeriod=30 Dec 02 07:56:15 crc kubenswrapper[4691]: I1202 07:56:15.932896 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="sbdb" containerID="cri-o://643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4" gracePeriod=30 Dec 02 07:56:15 crc kubenswrapper[4691]: I1202 07:56:15.932928 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="northd" containerID="cri-o://a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518" gracePeriod=30 Dec 02 07:56:15 crc kubenswrapper[4691]: I1202 07:56:15.932980 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0" gracePeriod=30 Dec 02 07:56:15 crc kubenswrapper[4691]: I1202 07:56:15.932597 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovn-controller" containerID="cri-o://465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29" gracePeriod=30 Dec 02 07:56:15 crc kubenswrapper[4691]: I1202 07:56:15.967551 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" containerID="cri-o://0a5db1288676acd22e6bdb6f14c182c8bb9c93429b8c8b433d96b27c2b5e544e" gracePeriod=30 Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.196212 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovnkube-controller/3.log" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.200308 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovn-acl-logging/0.log" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.201481 4691 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovn-controller/0.log" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202396 4691 generic.go:334] "Generic (PLEG): container finished" podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="0a5db1288676acd22e6bdb6f14c182c8bb9c93429b8c8b433d96b27c2b5e544e" exitCode=0 Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202431 4691 generic.go:334] "Generic (PLEG): container finished" podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4" exitCode=0 Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202443 4691 generic.go:334] "Generic (PLEG): container finished" podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c" exitCode=0 Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202478 4691 generic.go:334] "Generic (PLEG): container finished" podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518" exitCode=0 Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202490 4691 generic.go:334] "Generic (PLEG): container finished" podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0" exitCode=0 Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202473 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"0a5db1288676acd22e6bdb6f14c182c8bb9c93429b8c8b433d96b27c2b5e544e"} Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202554 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4"} Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202571 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c"} Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202587 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518"} Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202594 4691 scope.go:117] "RemoveContainer" containerID="4ffee9841e8cff0302c2380bc8b935740d02e9726316c66ed194bfe5c2de75d4" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202600 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0"} Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202501 4691 generic.go:334] "Generic (PLEG): container finished" podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340" exitCode=0 Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202806 4691 generic.go:334] "Generic (PLEG): container finished" 
podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a" exitCode=143 Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202828 4691 generic.go:334] "Generic (PLEG): container finished" podID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerID="465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29" exitCode=143 Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202805 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340"} Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202913 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a"} Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202927 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29"} Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202943 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" event={"ID":"3605748c-8980-4aa9-8d28-f18a17aa8124","Type":"ContainerDied","Data":"5d3d1eb7e31f60d5fe2f562867d91bc5039d6333763d3387e87aafd096cc1ede"} Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.202956 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d3d1eb7e31f60d5fe2f562867d91bc5039d6333763d3387e87aafd096cc1ede" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.205052 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gcsh_eb6171dd-c2ea-4c52-b906-e8a9a7ff6537/kube-multus/2.log" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.205515 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gcsh_eb6171dd-c2ea-4c52-b906-e8a9a7ff6537/kube-multus/1.log" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.205556 4691 generic.go:334] "Generic (PLEG): container finished" podID="eb6171dd-c2ea-4c52-b906-e8a9a7ff6537" containerID="9a3b02e6e070c37eaa327e48115ddfe37fb61e3e7f06d3a121542f798fd2097f" exitCode=2 Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.205587 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gcsh" event={"ID":"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537","Type":"ContainerDied","Data":"9a3b02e6e070c37eaa327e48115ddfe37fb61e3e7f06d3a121542f798fd2097f"} Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.206479 4691 scope.go:117] "RemoveContainer" containerID="9a3b02e6e070c37eaa327e48115ddfe37fb61e3e7f06d3a121542f798fd2097f" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.206924 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6gcsh_openshift-multus(eb6171dd-c2ea-4c52-b906-e8a9a7ff6537)\"" pod="openshift-multus/multus-6gcsh" podUID="eb6171dd-c2ea-4c52-b906-e8a9a7ff6537" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.220440 4691 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovn-acl-logging/0.log" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.220899 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovn-controller/0.log" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.221249 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.224398 4691 scope.go:117] "RemoveContainer" containerID="0b18518bf13d33754e2a8f6985e2da3df9cab1fc54a2e38240a65685c3fb2722" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.286396 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rf2wl"] Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.286735 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.286779 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.286794 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="nbdb" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.286806 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="nbdb" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.286817 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.286826 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.286836 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.286844 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.286854 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="northd" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.286864 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="northd" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.286876 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="kube-rbac-proxy-node" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.286884 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="kube-rbac-proxy-node" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.286896 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="kubecfg-setup" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.286903 
4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="kubecfg-setup" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.286911 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="sbdb" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.286920 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="sbdb" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.286931 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.286940 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.286956 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.286965 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.286974 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovn-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.286982 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovn-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.286995 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovn-acl-logging" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287003 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovn-acl-logging" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287132 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="kube-rbac-proxy-node" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287148 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovn-acl-logging" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287161 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287169 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="northd" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287179 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovn-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287191 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287202 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="nbdb" Dec 02 07:56:16 crc kubenswrapper[4691]: 
I1202 07:56:16.287212 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287224 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="kube-rbac-proxy-ovn-metrics" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287235 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="sbdb" Dec 02 07:56:16 crc kubenswrapper[4691]: E1202 07:56:16.287355 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287366 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287490 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.287507 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" containerName="ovnkube-controller" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.289665 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371099 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-cni-netd\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371163 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8q22\" (UniqueName: \"kubernetes.io/projected/3605748c-8980-4aa9-8d28-f18a17aa8124-kube-api-access-s8q22\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371206 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-var-lib-cni-networks-ovn-kubernetes\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371255 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-run-netns\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371289 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-cni-bin\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371347 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371382 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371705 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371736 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-log-socket\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371862 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-kubelet\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371883 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-etc-openvswitch\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371898 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-slash\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371921 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-openvswitch\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371941 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-systemd\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371956 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-ovn\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371793 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-log-socket" (OuterVolumeSpecName: "log-socket") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371923 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.371975 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-env-overrides\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372029 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372102 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372112 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-ovnkube-config\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372130 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372138 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-run-ovn-kubernetes\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372158 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-slash" (OuterVolumeSpecName: "host-slash") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372189 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372195 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-node-log\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372214 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-node-log" (OuterVolumeSpecName: "node-log") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372285 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3605748c-8980-4aa9-8d28-f18a17aa8124-ovn-node-metrics-cert\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372323 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-var-lib-openvswitch\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372366 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-ovnkube-script-lib\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372426 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-systemd-units\") pod \"3605748c-8980-4aa9-8d28-f18a17aa8124\" (UID: \"3605748c-8980-4aa9-8d28-f18a17aa8124\") " Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372552 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372586 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372612 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372680 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372686 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-cni-netd\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372738 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-etc-openvswitch\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372748 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372781 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-node-log\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372874 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-run-openvswitch\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372925 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dcea46c0-dde1-4377-95f5-841fd373c386-ovnkube-script-lib\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372943 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-var-lib-openvswitch\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.372983 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-systemd-units\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373036 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-cni-bin\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373066 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dcea46c0-dde1-4377-95f5-841fd373c386-ovnkube-config\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373121 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dcea46c0-dde1-4377-95f5-841fd373c386-env-overrides\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373152 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373174 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-run-systemd\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373199 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgsk2\" (UniqueName: \"kubernetes.io/projected/dcea46c0-dde1-4377-95f5-841fd373c386-kube-api-access-kgsk2\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373248 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-log-socket\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373243 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373272 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dcea46c0-dde1-4377-95f5-841fd373c386-ovn-node-metrics-cert\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373462 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-slash\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373572 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-run-ovn-kubernetes\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373607 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-run-ovn\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373622 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-kubelet\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373651 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-run-netns\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373731 4691 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373741 4691 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373751 4691 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373780 4691 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc 
kubenswrapper[4691]: I1202 07:56:16.373791 4691 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373801 4691 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-node-log\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373810 4691 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373818 4691 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3605748c-8980-4aa9-8d28-f18a17aa8124-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373852 4691 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373861 4691 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373874 4691 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373885 4691 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373895 4691 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373903 4691 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-log-socket\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373912 4691 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373920 4691 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.373929 4691 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-host-slash\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: 
I1202 07:56:16.377884 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3605748c-8980-4aa9-8d28-f18a17aa8124-kube-api-access-s8q22" (OuterVolumeSpecName: "kube-api-access-s8q22") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "kube-api-access-s8q22". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.378053 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3605748c-8980-4aa9-8d28-f18a17aa8124-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.385202 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "3605748c-8980-4aa9-8d28-f18a17aa8124" (UID: "3605748c-8980-4aa9-8d28-f18a17aa8124"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475104 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-run-netns\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475199 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-cni-netd\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475258 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-etc-openvswitch\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475278 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-run-netns\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475387 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-node-log\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475399 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-cni-netd\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475301 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-node-log\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475556 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-run-openvswitch\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475454 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-etc-openvswitch\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475616 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dcea46c0-dde1-4377-95f5-841fd373c386-ovnkube-script-lib\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475702 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-var-lib-openvswitch\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475641 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-run-openvswitch\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475856 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-systemd-units\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475882 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-var-lib-openvswitch\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475934 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-cni-bin\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 
07:56:16.476000 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dcea46c0-dde1-4377-95f5-841fd373c386-ovnkube-config\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476033 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-cni-bin\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.475989 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-systemd-units\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476105 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dcea46c0-dde1-4377-95f5-841fd373c386-env-overrides\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476149 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476192 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-run-systemd\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476223 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgsk2\" (UniqueName: \"kubernetes.io/projected/dcea46c0-dde1-4377-95f5-841fd373c386-kube-api-access-kgsk2\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476274 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-log-socket\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476315 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dcea46c0-dde1-4377-95f5-841fd373c386-ovn-node-metrics-cert\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476321 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-run-systemd\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476342 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-slash\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476373 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-slash\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476414 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-run-ovn-kubernetes\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476426 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476492 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-run-ovn\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476457 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-run-ovn\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476522 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-run-ovn-kubernetes\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476547 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-kubelet\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476536 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-log-socket\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476595 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dcea46c0-dde1-4377-95f5-841fd373c386-host-kubelet\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476805 4691 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3605748c-8980-4aa9-8d28-f18a17aa8124-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476823 4691 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3605748c-8980-4aa9-8d28-f18a17aa8124-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.476836 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8q22\" (UniqueName: \"kubernetes.io/projected/3605748c-8980-4aa9-8d28-f18a17aa8124-kube-api-access-s8q22\") on node \"crc\" DevicePath \"\"" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.477228 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dcea46c0-dde1-4377-95f5-841fd373c386-ovnkube-script-lib\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.477607 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dcea46c0-dde1-4377-95f5-841fd373c386-env-overrides\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.478023 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dcea46c0-dde1-4377-95f5-841fd373c386-ovnkube-config\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.480251 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dcea46c0-dde1-4377-95f5-841fd373c386-ovn-node-metrics-cert\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.496836 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgsk2\" (UniqueName: \"kubernetes.io/projected/dcea46c0-dde1-4377-95f5-841fd373c386-kube-api-access-kgsk2\") pod \"ovnkube-node-rf2wl\" (UID: \"dcea46c0-dde1-4377-95f5-841fd373c386\") " pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: I1202 07:56:16.602618 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:16 crc kubenswrapper[4691]: W1202 07:56:16.623175 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcea46c0_dde1_4377_95f5_841fd373c386.slice/crio-1209f1b33fa7719b72ed97624b530322992c14112d938d2532852c7729393a14 WatchSource:0}: Error finding container 1209f1b33fa7719b72ed97624b530322992c14112d938d2532852c7729393a14: Status 404 returned error can't find the container with id 1209f1b33fa7719b72ed97624b530322992c14112d938d2532852c7729393a14 Dec 02 07:56:17 crc kubenswrapper[4691]: I1202 07:56:17.223216 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovn-acl-logging/0.log" Dec 02 07:56:17 crc kubenswrapper[4691]: I1202 07:56:17.224281 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-7pgxh_3605748c-8980-4aa9-8d28-f18a17aa8124/ovn-controller/0.log" Dec 02 07:56:17 crc kubenswrapper[4691]: I1202 07:56:17.224937 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7pgxh" Dec 02 07:56:17 crc kubenswrapper[4691]: I1202 07:56:17.226690 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gcsh_eb6171dd-c2ea-4c52-b906-e8a9a7ff6537/kube-multus/2.log" Dec 02 07:56:17 crc kubenswrapper[4691]: I1202 07:56:17.230223 4691 generic.go:334] "Generic (PLEG): container finished" podID="dcea46c0-dde1-4377-95f5-841fd373c386" containerID="0ff432ae86c40f52eb6526d1693ed99bed22f6464b3936844de088c2f39467e4" exitCode=0 Dec 02 07:56:17 crc kubenswrapper[4691]: I1202 07:56:17.230328 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" event={"ID":"dcea46c0-dde1-4377-95f5-841fd373c386","Type":"ContainerDied","Data":"0ff432ae86c40f52eb6526d1693ed99bed22f6464b3936844de088c2f39467e4"} Dec 02 07:56:17 crc kubenswrapper[4691]: I1202 07:56:17.230386 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" event={"ID":"dcea46c0-dde1-4377-95f5-841fd373c386","Type":"ContainerStarted","Data":"1209f1b33fa7719b72ed97624b530322992c14112d938d2532852c7729393a14"} Dec 02 07:56:17 crc kubenswrapper[4691]: I1202 07:56:17.318723 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pgxh"] Dec 02 07:56:17 crc kubenswrapper[4691]: I1202 07:56:17.322373 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7pgxh"] Dec 02 07:56:18 crc kubenswrapper[4691]: I1202 07:56:18.241490 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" event={"ID":"dcea46c0-dde1-4377-95f5-841fd373c386","Type":"ContainerStarted","Data":"a43c80f3a805c1d17595af68ee39b4791e12b0ef42620ac1f0d5d207e08634c1"} Dec 02 07:56:18 crc kubenswrapper[4691]: I1202 07:56:18.241873 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" event={"ID":"dcea46c0-dde1-4377-95f5-841fd373c386","Type":"ContainerStarted","Data":"8c6442dfa72f6d98c938838815c8418cdaec72cc80841060bf86cca55de206ee"} Dec 02 07:56:18 crc kubenswrapper[4691]: I1202 07:56:18.241889 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" 
event={"ID":"dcea46c0-dde1-4377-95f5-841fd373c386","Type":"ContainerStarted","Data":"51fcca6e5f637b2d03c856030b81ac0440597e4b5ca8355f425d9d7f53c7d9eb"} Dec 02 07:56:18 crc kubenswrapper[4691]: I1202 07:56:18.241901 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" event={"ID":"dcea46c0-dde1-4377-95f5-841fd373c386","Type":"ContainerStarted","Data":"bf1f46099e2e0eb857da4a0f0571747c3dcc6454143367944c5cb39f56ea0591"} Dec 02 07:56:18 crc kubenswrapper[4691]: I1202 07:56:18.241913 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" event={"ID":"dcea46c0-dde1-4377-95f5-841fd373c386","Type":"ContainerStarted","Data":"b909bc7fdd75ae2e40e900dcef372e063ec2080505830716d79c481a5a4fc722"} Dec 02 07:56:18 crc kubenswrapper[4691]: I1202 07:56:18.241930 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" event={"ID":"dcea46c0-dde1-4377-95f5-841fd373c386","Type":"ContainerStarted","Data":"7abbc4e7d85983d44f5da9327488fc3dd708a01934c7897a870225abf2cb4dcf"} Dec 02 07:56:18 crc kubenswrapper[4691]: I1202 07:56:18.568622 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3605748c-8980-4aa9-8d28-f18a17aa8124" path="/var/lib/kubelet/pods/3605748c-8980-4aa9-8d28-f18a17aa8124/volumes" Dec 02 07:56:20 crc kubenswrapper[4691]: I1202 07:56:20.259669 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" event={"ID":"dcea46c0-dde1-4377-95f5-841fd373c386","Type":"ContainerStarted","Data":"37778d7d4009c7713156152bf2e3eb6835ffbbedaeb38790fa5984590cdc7067"} Dec 02 07:56:21 crc kubenswrapper[4691]: I1202 07:56:21.898615 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:56:21 crc kubenswrapper[4691]: I1202 07:56:21.898972 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:56:21 crc kubenswrapper[4691]: I1202 07:56:21.899023 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:56:21 crc kubenswrapper[4691]: I1202 07:56:21.899683 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a16996ecf2c98a339a05624da8a98affe6240ab365ac144c76fc73906ae11b70"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 07:56:21 crc kubenswrapper[4691]: I1202 07:56:21.899742 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://a16996ecf2c98a339a05624da8a98affe6240ab365ac144c76fc73906ae11b70" gracePeriod=600 Dec 02 07:56:22 crc kubenswrapper[4691]: I1202 07:56:22.277046 4691 generic.go:334] 
"Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="a16996ecf2c98a339a05624da8a98affe6240ab365ac144c76fc73906ae11b70" exitCode=0 Dec 02 07:56:22 crc kubenswrapper[4691]: I1202 07:56:22.277100 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"a16996ecf2c98a339a05624da8a98affe6240ab365ac144c76fc73906ae11b70"} Dec 02 07:56:22 crc kubenswrapper[4691]: I1202 07:56:22.277130 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"49c546328dbd8547e0ff1dcfee99f503a31e4448db0773f3ebd91ead3aa35f8b"} Dec 02 07:56:22 crc kubenswrapper[4691]: I1202 07:56:22.277147 4691 scope.go:117] "RemoveContainer" containerID="bb78041367ae6920b31cd251da35411d957791f1be4b05d33750afac0123755e" Dec 02 07:56:23 crc kubenswrapper[4691]: I1202 07:56:23.288065 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" event={"ID":"dcea46c0-dde1-4377-95f5-841fd373c386","Type":"ContainerStarted","Data":"0a05e0b15de6877340d2d3718b149f39aab1683cf039c3525dd9069a5f10833f"} Dec 02 07:56:23 crc kubenswrapper[4691]: I1202 07:56:23.288624 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:23 crc kubenswrapper[4691]: I1202 07:56:23.288641 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:23 crc kubenswrapper[4691]: I1202 07:56:23.315680 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:23 crc kubenswrapper[4691]: I1202 07:56:23.321987 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" podStartSLOduration=7.321967692 podStartE2EDuration="7.321967692s" podCreationTimestamp="2025-12-02 07:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:56:23.320986107 +0000 UTC m=+631.105064989" watchObservedRunningTime="2025-12-02 07:56:23.321967692 +0000 UTC m=+631.106046574" Dec 02 07:56:24 crc kubenswrapper[4691]: I1202 07:56:24.293481 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:24 crc kubenswrapper[4691]: I1202 07:56:24.324659 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:28 crc kubenswrapper[4691]: I1202 07:56:28.566884 4691 scope.go:117] "RemoveContainer" containerID="9a3b02e6e070c37eaa327e48115ddfe37fb61e3e7f06d3a121542f798fd2097f" Dec 02 07:56:28 crc kubenswrapper[4691]: E1202 07:56:28.567447 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6gcsh_openshift-multus(eb6171dd-c2ea-4c52-b906-e8a9a7ff6537)\"" pod="openshift-multus/multus-6gcsh" podUID="eb6171dd-c2ea-4c52-b906-e8a9a7ff6537" Dec 02 07:56:40 crc kubenswrapper[4691]: I1202 07:56:40.562482 4691 scope.go:117] "RemoveContainer" 
containerID="9a3b02e6e070c37eaa327e48115ddfe37fb61e3e7f06d3a121542f798fd2097f" Dec 02 07:56:41 crc kubenswrapper[4691]: I1202 07:56:41.397984 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6gcsh_eb6171dd-c2ea-4c52-b906-e8a9a7ff6537/kube-multus/2.log" Dec 02 07:56:41 crc kubenswrapper[4691]: I1202 07:56:41.398356 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6gcsh" event={"ID":"eb6171dd-c2ea-4c52-b906-e8a9a7ff6537","Type":"ContainerStarted","Data":"251bd679354d57e7b408395c4f498768e38e0c666cdb7bd556179e041eb7123f"} Dec 02 07:56:46 crc kubenswrapper[4691]: I1202 07:56:46.625548 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rf2wl" Dec 02 07:56:52 crc kubenswrapper[4691]: I1202 07:56:52.722236 4691 scope.go:117] "RemoveContainer" containerID="8829984e2bb1eff5e90a1d87c7664672a1cbaf35fb0bd94da472aab0fc2d89d0" Dec 02 07:56:52 crc kubenswrapper[4691]: I1202 07:56:52.752447 4691 scope.go:117] "RemoveContainer" containerID="465656ff62053df8df9ff25d72a2aad7bffdca4dd0dd50f9f75dc13738abee29" Dec 02 07:56:52 crc kubenswrapper[4691]: I1202 07:56:52.772835 4691 scope.go:117] "RemoveContainer" containerID="d34d9b95b4f723cfd794a9dbf63893873b13f5049b85cf78c3891bc6ec2f957c" Dec 02 07:56:52 crc kubenswrapper[4691]: I1202 07:56:52.796319 4691 scope.go:117] "RemoveContainer" containerID="46432f3b5d689a4723cd4e1ad9369262b62ecbcd4b42e0fbbc15ac84079f217c" Dec 02 07:56:52 crc kubenswrapper[4691]: I1202 07:56:52.819676 4691 scope.go:117] "RemoveContainer" containerID="0a5db1288676acd22e6bdb6f14c182c8bb9c93429b8c8b433d96b27c2b5e544e" Dec 02 07:56:52 crc kubenswrapper[4691]: I1202 07:56:52.839602 4691 scope.go:117] "RemoveContainer" containerID="64c334b693cd12d45fdccda71675b007ed20fb71836a28b414fa6dc99c790340" Dec 02 07:56:52 crc kubenswrapper[4691]: I1202 07:56:52.858625 4691 scope.go:117] "RemoveContainer" containerID="643c4e9c55621ccf3998ba500d316dca4ad3009e81454985bb271a45340cf0f4" Dec 02 07:56:52 crc kubenswrapper[4691]: I1202 07:56:52.872875 4691 scope.go:117] "RemoveContainer" containerID="19b652ce71748c59585750503e3780af968384b603bf5962dff1919aba39757a" Dec 02 07:56:52 crc kubenswrapper[4691]: I1202 07:56:52.894359 4691 scope.go:117] "RemoveContainer" containerID="a162e1845346d910c9f307cd8d04a95dfaed05f10c8add5e2ecf2ee49febc518" Dec 02 07:56:57 crc kubenswrapper[4691]: I1202 07:56:57.941497 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg"] Dec 02 07:56:57 crc kubenswrapper[4691]: I1202 07:56:57.943112 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:56:57 crc kubenswrapper[4691]: I1202 07:56:57.952213 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 02 07:56:57 crc kubenswrapper[4691]: I1202 07:56:57.955858 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg"] Dec 02 07:56:58 crc kubenswrapper[4691]: I1202 07:56:58.039455 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4p5p\" (UniqueName: \"kubernetes.io/projected/9f953091-57b0-4169-81cc-16a8bbf4a356-kube-api-access-r4p5p\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg\" (UID: \"9f953091-57b0-4169-81cc-16a8bbf4a356\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:56:58 crc kubenswrapper[4691]: I1202 07:56:58.039515 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f953091-57b0-4169-81cc-16a8bbf4a356-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg\" (UID: \"9f953091-57b0-4169-81cc-16a8bbf4a356\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:56:58 crc kubenswrapper[4691]: I1202 07:56:58.039562 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f953091-57b0-4169-81cc-16a8bbf4a356-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg\" (UID: \"9f953091-57b0-4169-81cc-16a8bbf4a356\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:56:58 crc kubenswrapper[4691]: I1202 07:56:58.141638 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f953091-57b0-4169-81cc-16a8bbf4a356-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg\" (UID: \"9f953091-57b0-4169-81cc-16a8bbf4a356\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:56:58 crc kubenswrapper[4691]: I1202 07:56:58.141776 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4p5p\" (UniqueName: \"kubernetes.io/projected/9f953091-57b0-4169-81cc-16a8bbf4a356-kube-api-access-r4p5p\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg\" (UID: \"9f953091-57b0-4169-81cc-16a8bbf4a356\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:56:58 crc kubenswrapper[4691]: I1202 07:56:58.141817 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f953091-57b0-4169-81cc-16a8bbf4a356-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg\" (UID: \"9f953091-57b0-4169-81cc-16a8bbf4a356\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:56:58 crc kubenswrapper[4691]: I1202 07:56:58.142307 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9f953091-57b0-4169-81cc-16a8bbf4a356-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg\" (UID: \"9f953091-57b0-4169-81cc-16a8bbf4a356\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:56:58 crc kubenswrapper[4691]: I1202 07:56:58.142401 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f953091-57b0-4169-81cc-16a8bbf4a356-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg\" (UID: \"9f953091-57b0-4169-81cc-16a8bbf4a356\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:56:58 crc kubenswrapper[4691]: I1202 07:56:58.162347 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4p5p\" (UniqueName: \"kubernetes.io/projected/9f953091-57b0-4169-81cc-16a8bbf4a356-kube-api-access-r4p5p\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg\" (UID: \"9f953091-57b0-4169-81cc-16a8bbf4a356\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:56:58 crc kubenswrapper[4691]: I1202 07:56:58.271086 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:56:58 crc kubenswrapper[4691]: I1202 07:56:58.538256 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg"] Dec 02 07:56:59 crc kubenswrapper[4691]: I1202 07:56:59.513590 4691 generic.go:334] "Generic (PLEG): container finished" podID="9f953091-57b0-4169-81cc-16a8bbf4a356" containerID="fc16f024f6d0f55fe5e0870dbd89ac222b91144c714ca68d7ea47fcdf4a543a3" exitCode=0 Dec 02 07:56:59 crc kubenswrapper[4691]: I1202 07:56:59.513733 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" event={"ID":"9f953091-57b0-4169-81cc-16a8bbf4a356","Type":"ContainerDied","Data":"fc16f024f6d0f55fe5e0870dbd89ac222b91144c714ca68d7ea47fcdf4a543a3"} Dec 02 07:56:59 crc kubenswrapper[4691]: I1202 07:56:59.513944 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" event={"ID":"9f953091-57b0-4169-81cc-16a8bbf4a356","Type":"ContainerStarted","Data":"b0d10ec53a2f83114caaa1f86eaedf31ba38e61e2e573270bc4792f621f42587"} Dec 02 07:57:01 crc kubenswrapper[4691]: I1202 07:57:01.530325 4691 generic.go:334] "Generic (PLEG): container finished" podID="9f953091-57b0-4169-81cc-16a8bbf4a356" containerID="5f650cad28a4195e470065a0527f5a2d5435a9c75ceacb454298d5d8178438c6" exitCode=0 Dec 02 07:57:01 crc kubenswrapper[4691]: I1202 07:57:01.530376 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" event={"ID":"9f953091-57b0-4169-81cc-16a8bbf4a356","Type":"ContainerDied","Data":"5f650cad28a4195e470065a0527f5a2d5435a9c75ceacb454298d5d8178438c6"} Dec 02 07:57:02 crc kubenswrapper[4691]: I1202 07:57:02.543243 4691 generic.go:334] "Generic (PLEG): container finished" podID="9f953091-57b0-4169-81cc-16a8bbf4a356" containerID="66bc8bec692a1b98684d164535b493b6ef08993f0a400b8fd1a5c4bff316651b" exitCode=0 Dec 02 07:57:02 crc kubenswrapper[4691]: I1202 
07:57:02.543321 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" event={"ID":"9f953091-57b0-4169-81cc-16a8bbf4a356","Type":"ContainerDied","Data":"66bc8bec692a1b98684d164535b493b6ef08993f0a400b8fd1a5c4bff316651b"} Dec 02 07:57:03 crc kubenswrapper[4691]: I1202 07:57:03.740846 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:57:03 crc kubenswrapper[4691]: I1202 07:57:03.826923 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f953091-57b0-4169-81cc-16a8bbf4a356-bundle\") pod \"9f953091-57b0-4169-81cc-16a8bbf4a356\" (UID: \"9f953091-57b0-4169-81cc-16a8bbf4a356\") " Dec 02 07:57:03 crc kubenswrapper[4691]: I1202 07:57:03.827026 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f953091-57b0-4169-81cc-16a8bbf4a356-util\") pod \"9f953091-57b0-4169-81cc-16a8bbf4a356\" (UID: \"9f953091-57b0-4169-81cc-16a8bbf4a356\") " Dec 02 07:57:03 crc kubenswrapper[4691]: I1202 07:57:03.827233 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4p5p\" (UniqueName: \"kubernetes.io/projected/9f953091-57b0-4169-81cc-16a8bbf4a356-kube-api-access-r4p5p\") pod \"9f953091-57b0-4169-81cc-16a8bbf4a356\" (UID: \"9f953091-57b0-4169-81cc-16a8bbf4a356\") " Dec 02 07:57:03 crc kubenswrapper[4691]: I1202 07:57:03.827566 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f953091-57b0-4169-81cc-16a8bbf4a356-bundle" (OuterVolumeSpecName: "bundle") pod "9f953091-57b0-4169-81cc-16a8bbf4a356" (UID: "9f953091-57b0-4169-81cc-16a8bbf4a356"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:57:03 crc kubenswrapper[4691]: I1202 07:57:03.833862 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f953091-57b0-4169-81cc-16a8bbf4a356-kube-api-access-r4p5p" (OuterVolumeSpecName: "kube-api-access-r4p5p") pod "9f953091-57b0-4169-81cc-16a8bbf4a356" (UID: "9f953091-57b0-4169-81cc-16a8bbf4a356"). InnerVolumeSpecName "kube-api-access-r4p5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:57:03 crc kubenswrapper[4691]: I1202 07:57:03.929366 4691 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9f953091-57b0-4169-81cc-16a8bbf4a356-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:03 crc kubenswrapper[4691]: I1202 07:57:03.929412 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4p5p\" (UniqueName: \"kubernetes.io/projected/9f953091-57b0-4169-81cc-16a8bbf4a356-kube-api-access-r4p5p\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:04 crc kubenswrapper[4691]: I1202 07:57:04.104558 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f953091-57b0-4169-81cc-16a8bbf4a356-util" (OuterVolumeSpecName: "util") pod "9f953091-57b0-4169-81cc-16a8bbf4a356" (UID: "9f953091-57b0-4169-81cc-16a8bbf4a356"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:57:04 crc kubenswrapper[4691]: I1202 07:57:04.131819 4691 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9f953091-57b0-4169-81cc-16a8bbf4a356-util\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:04 crc kubenswrapper[4691]: I1202 07:57:04.559974 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" event={"ID":"9f953091-57b0-4169-81cc-16a8bbf4a356","Type":"ContainerDied","Data":"b0d10ec53a2f83114caaa1f86eaedf31ba38e61e2e573270bc4792f621f42587"} Dec 02 07:57:04 crc kubenswrapper[4691]: I1202 07:57:04.560027 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0d10ec53a2f83114caaa1f86eaedf31ba38e61e2e573270bc4792f621f42587" Dec 02 07:57:04 crc kubenswrapper[4691]: I1202 07:57:04.560187 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.409104 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-mztlh"] Dec 02 07:57:05 crc kubenswrapper[4691]: E1202 07:57:05.409656 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f953091-57b0-4169-81cc-16a8bbf4a356" containerName="pull" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.409670 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f953091-57b0-4169-81cc-16a8bbf4a356" containerName="pull" Dec 02 07:57:05 crc kubenswrapper[4691]: E1202 07:57:05.409696 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f953091-57b0-4169-81cc-16a8bbf4a356" containerName="extract" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.409703 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f953091-57b0-4169-81cc-16a8bbf4a356" containerName="extract" Dec 02 07:57:05 crc kubenswrapper[4691]: E1202 07:57:05.409714 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f953091-57b0-4169-81cc-16a8bbf4a356" containerName="util" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.409721 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f953091-57b0-4169-81cc-16a8bbf4a356" containerName="util" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.409856 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f953091-57b0-4169-81cc-16a8bbf4a356" containerName="extract" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.410314 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mztlh" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.413271 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.413673 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-wmlx4" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.414012 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.419700 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-mztlh"] Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.548872 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kjfq\" (UniqueName: \"kubernetes.io/projected/28cc1c9c-b36a-4783-ba53-4504f085b70d-kube-api-access-4kjfq\") pod \"nmstate-operator-5b5b58f5c8-mztlh\" (UID: \"28cc1c9c-b36a-4783-ba53-4504f085b70d\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mztlh" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.649898 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kjfq\" (UniqueName: \"kubernetes.io/projected/28cc1c9c-b36a-4783-ba53-4504f085b70d-kube-api-access-4kjfq\") pod \"nmstate-operator-5b5b58f5c8-mztlh\" (UID: \"28cc1c9c-b36a-4783-ba53-4504f085b70d\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mztlh" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.674260 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kjfq\" (UniqueName: \"kubernetes.io/projected/28cc1c9c-b36a-4783-ba53-4504f085b70d-kube-api-access-4kjfq\") pod \"nmstate-operator-5b5b58f5c8-mztlh\" (UID: \"28cc1c9c-b36a-4783-ba53-4504f085b70d\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mztlh" Dec 02 07:57:05 crc kubenswrapper[4691]: I1202 07:57:05.782082 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mztlh" Dec 02 07:57:06 crc kubenswrapper[4691]: I1202 07:57:06.181622 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-mztlh"] Dec 02 07:57:06 crc kubenswrapper[4691]: W1202 07:57:06.192986 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28cc1c9c_b36a_4783_ba53_4504f085b70d.slice/crio-bee3aaff9237d24bc3f3390703a01e1378186cbd7dd1fc488e224c4923615a76 WatchSource:0}: Error finding container bee3aaff9237d24bc3f3390703a01e1378186cbd7dd1fc488e224c4923615a76: Status 404 returned error can't find the container with id bee3aaff9237d24bc3f3390703a01e1378186cbd7dd1fc488e224c4923615a76 Dec 02 07:57:06 crc kubenswrapper[4691]: I1202 07:57:06.571916 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mztlh" event={"ID":"28cc1c9c-b36a-4783-ba53-4504f085b70d","Type":"ContainerStarted","Data":"bee3aaff9237d24bc3f3390703a01e1378186cbd7dd1fc488e224c4923615a76"} Dec 02 07:57:09 crc kubenswrapper[4691]: I1202 07:57:09.589540 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mztlh" event={"ID":"28cc1c9c-b36a-4783-ba53-4504f085b70d","Type":"ContainerStarted","Data":"50184d0300d16b672a96f1eec07cff7fb025f5fc6249d369eeaa2d391855e588"} Dec 02 07:57:09 crc kubenswrapper[4691]: I1202 07:57:09.615350 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-mztlh" podStartSLOduration=2.124231944 podStartE2EDuration="4.615333791s" podCreationTimestamp="2025-12-02 07:57:05 +0000 UTC" firstStartedPulling="2025-12-02 07:57:06.194632299 +0000 UTC m=+673.978711161" lastFinishedPulling="2025-12-02 07:57:08.685734146 +0000 UTC m=+676.469813008" observedRunningTime="2025-12-02 07:57:09.614839578 +0000 UTC m=+677.398918460" watchObservedRunningTime="2025-12-02 07:57:09.615333791 +0000 UTC m=+677.399412643" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.620510 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-f2q6n"] Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.621498 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-f2q6n" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.623568 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-lbtpb" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.635598 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld"] Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.636664 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.638490 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.648179 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-f2q6n"] Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.654430 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-7scpl"] Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.655587 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.701912 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld"] Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.715619 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ef02c668-a715-48cb-8efb-9c52cdb28e9d-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8slld\" (UID: \"ef02c668-a715-48cb-8efb-9c52cdb28e9d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.715745 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brp8r\" (UniqueName: \"kubernetes.io/projected/dec2c181-9b51-4e2b-95d1-d98fa9102b3a-kube-api-access-brp8r\") pod \"nmstate-metrics-7f946cbc9-f2q6n\" (UID: \"dec2c181-9b51-4e2b-95d1-d98fa9102b3a\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-f2q6n" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.715784 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t6m4\" (UniqueName: \"kubernetes.io/projected/ef02c668-a715-48cb-8efb-9c52cdb28e9d-kube-api-access-8t6m4\") pod \"nmstate-webhook-5f6d4c5ccb-8slld\" (UID: \"ef02c668-a715-48cb-8efb-9c52cdb28e9d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.758254 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp"] Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.759257 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.761533 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.761533 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-jgqrd" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.761534 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.768420 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp"] Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.817339 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brp8r\" (UniqueName: \"kubernetes.io/projected/dec2c181-9b51-4e2b-95d1-d98fa9102b3a-kube-api-access-brp8r\") pod \"nmstate-metrics-7f946cbc9-f2q6n\" (UID: \"dec2c181-9b51-4e2b-95d1-d98fa9102b3a\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-f2q6n" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.817384 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c0ce4e86-7cec-4db1-975d-51ee41f94337-nmstate-lock\") pod \"nmstate-handler-7scpl\" (UID: \"c0ce4e86-7cec-4db1-975d-51ee41f94337\") " pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.817408 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47492\" (UniqueName: \"kubernetes.io/projected/c0ce4e86-7cec-4db1-975d-51ee41f94337-kube-api-access-47492\") pod \"nmstate-handler-7scpl\" (UID: \"c0ce4e86-7cec-4db1-975d-51ee41f94337\") " pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.817529 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t6m4\" (UniqueName: \"kubernetes.io/projected/ef02c668-a715-48cb-8efb-9c52cdb28e9d-kube-api-access-8t6m4\") pod \"nmstate-webhook-5f6d4c5ccb-8slld\" (UID: \"ef02c668-a715-48cb-8efb-9c52cdb28e9d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.817593 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c0ce4e86-7cec-4db1-975d-51ee41f94337-ovs-socket\") pod \"nmstate-handler-7scpl\" (UID: \"c0ce4e86-7cec-4db1-975d-51ee41f94337\") " pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.818295 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c0ce4e86-7cec-4db1-975d-51ee41f94337-dbus-socket\") pod \"nmstate-handler-7scpl\" (UID: \"c0ce4e86-7cec-4db1-975d-51ee41f94337\") " pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.818370 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ef02c668-a715-48cb-8efb-9c52cdb28e9d-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8slld\" (UID: 
\"ef02c668-a715-48cb-8efb-9c52cdb28e9d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" Dec 02 07:57:10 crc kubenswrapper[4691]: E1202 07:57:10.818528 4691 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Dec 02 07:57:10 crc kubenswrapper[4691]: E1202 07:57:10.818586 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef02c668-a715-48cb-8efb-9c52cdb28e9d-tls-key-pair podName:ef02c668-a715-48cb-8efb-9c52cdb28e9d nodeName:}" failed. No retries permitted until 2025-12-02 07:57:11.318568252 +0000 UTC m=+679.102647114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/ef02c668-a715-48cb-8efb-9c52cdb28e9d-tls-key-pair") pod "nmstate-webhook-5f6d4c5ccb-8slld" (UID: "ef02c668-a715-48cb-8efb-9c52cdb28e9d") : secret "openshift-nmstate-webhook" not found Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.834727 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brp8r\" (UniqueName: \"kubernetes.io/projected/dec2c181-9b51-4e2b-95d1-d98fa9102b3a-kube-api-access-brp8r\") pod \"nmstate-metrics-7f946cbc9-f2q6n\" (UID: \"dec2c181-9b51-4e2b-95d1-d98fa9102b3a\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-f2q6n" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.839582 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t6m4\" (UniqueName: \"kubernetes.io/projected/ef02c668-a715-48cb-8efb-9c52cdb28e9d-kube-api-access-8t6m4\") pod \"nmstate-webhook-5f6d4c5ccb-8slld\" (UID: \"ef02c668-a715-48cb-8efb-9c52cdb28e9d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.919130 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e7cee0-91cd-406a-a496-a13b4ee91e1e-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-wqdhp\" (UID: \"e4e7cee0-91cd-406a-a496-a13b4ee91e1e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.919211 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e4e7cee0-91cd-406a-a496-a13b4ee91e1e-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-wqdhp\" (UID: \"e4e7cee0-91cd-406a-a496-a13b4ee91e1e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.919255 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c0ce4e86-7cec-4db1-975d-51ee41f94337-nmstate-lock\") pod \"nmstate-handler-7scpl\" (UID: \"c0ce4e86-7cec-4db1-975d-51ee41f94337\") " pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.919281 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47492\" (UniqueName: \"kubernetes.io/projected/c0ce4e86-7cec-4db1-975d-51ee41f94337-kube-api-access-47492\") pod \"nmstate-handler-7scpl\" (UID: \"c0ce4e86-7cec-4db1-975d-51ee41f94337\") " pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.919325 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c0ce4e86-7cec-4db1-975d-51ee41f94337-ovs-socket\") pod \"nmstate-handler-7scpl\" (UID: \"c0ce4e86-7cec-4db1-975d-51ee41f94337\") " pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.919354 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xftfx\" (UniqueName: \"kubernetes.io/projected/e4e7cee0-91cd-406a-a496-a13b4ee91e1e-kube-api-access-xftfx\") pod \"nmstate-console-plugin-7fbb5f6569-wqdhp\" (UID: \"e4e7cee0-91cd-406a-a496-a13b4ee91e1e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.919381 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c0ce4e86-7cec-4db1-975d-51ee41f94337-dbus-socket\") pod \"nmstate-handler-7scpl\" (UID: \"c0ce4e86-7cec-4db1-975d-51ee41f94337\") " pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.919637 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/c0ce4e86-7cec-4db1-975d-51ee41f94337-dbus-socket\") pod \"nmstate-handler-7scpl\" (UID: \"c0ce4e86-7cec-4db1-975d-51ee41f94337\") " pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.919701 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/c0ce4e86-7cec-4db1-975d-51ee41f94337-ovs-socket\") pod \"nmstate-handler-7scpl\" (UID: \"c0ce4e86-7cec-4db1-975d-51ee41f94337\") " pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.919710 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/c0ce4e86-7cec-4db1-975d-51ee41f94337-nmstate-lock\") pod \"nmstate-handler-7scpl\" (UID: \"c0ce4e86-7cec-4db1-975d-51ee41f94337\") " pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.936260 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47492\" (UniqueName: \"kubernetes.io/projected/c0ce4e86-7cec-4db1-975d-51ee41f94337-kube-api-access-47492\") pod \"nmstate-handler-7scpl\" (UID: \"c0ce4e86-7cec-4db1-975d-51ee41f94337\") " pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.936572 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-f2q6n" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.939940 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-57cbf6ff7f-tn64r"] Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.940785 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.954593 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57cbf6ff7f-tn64r"] Dec 02 07:57:10 crc kubenswrapper[4691]: I1202 07:57:10.983142 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.020576 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e4e7cee0-91cd-406a-a496-a13b4ee91e1e-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-wqdhp\" (UID: \"e4e7cee0-91cd-406a-a496-a13b4ee91e1e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.020621 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48d43e1d-cc61-4a60-9636-dce4d726b1c6-oauth-serving-cert\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.020646 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48d43e1d-cc61-4a60-9636-dce4d726b1c6-trusted-ca-bundle\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.020680 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvlz\" (UniqueName: \"kubernetes.io/projected/48d43e1d-cc61-4a60-9636-dce4d726b1c6-kube-api-access-hpvlz\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.020705 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48d43e1d-cc61-4a60-9636-dce4d726b1c6-console-config\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.020745 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48d43e1d-cc61-4a60-9636-dce4d726b1c6-service-ca\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.020779 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xftfx\" (UniqueName: \"kubernetes.io/projected/e4e7cee0-91cd-406a-a496-a13b4ee91e1e-kube-api-access-xftfx\") pod \"nmstate-console-plugin-7fbb5f6569-wqdhp\" (UID: \"e4e7cee0-91cd-406a-a496-a13b4ee91e1e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.020968 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48d43e1d-cc61-4a60-9636-dce4d726b1c6-console-oauth-config\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.021025 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e7cee0-91cd-406a-a496-a13b4ee91e1e-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-wqdhp\" (UID: \"e4e7cee0-91cd-406a-a496-a13b4ee91e1e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.021086 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48d43e1d-cc61-4a60-9636-dce4d726b1c6-console-serving-cert\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.021533 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e4e7cee0-91cd-406a-a496-a13b4ee91e1e-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-wqdhp\" (UID: \"e4e7cee0-91cd-406a-a496-a13b4ee91e1e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.025343 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e4e7cee0-91cd-406a-a496-a13b4ee91e1e-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-wqdhp\" (UID: \"e4e7cee0-91cd-406a-a496-a13b4ee91e1e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.044961 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xftfx\" (UniqueName: \"kubernetes.io/projected/e4e7cee0-91cd-406a-a496-a13b4ee91e1e-kube-api-access-xftfx\") pod \"nmstate-console-plugin-7fbb5f6569-wqdhp\" (UID: \"e4e7cee0-91cd-406a-a496-a13b4ee91e1e\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" Dec 02 07:57:11 crc kubenswrapper[4691]: W1202 07:57:11.053874 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0ce4e86_7cec_4db1_975d_51ee41f94337.slice/crio-c8063e1cac678214d51f052bc5df4c28d9f1faa1d09e2714b32f483c4d13a233 WatchSource:0}: Error finding container c8063e1cac678214d51f052bc5df4c28d9f1faa1d09e2714b32f483c4d13a233: Status 404 returned error can't find the container with id c8063e1cac678214d51f052bc5df4c28d9f1faa1d09e2714b32f483c4d13a233 Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.076283 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.122328 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48d43e1d-cc61-4a60-9636-dce4d726b1c6-service-ca\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.122927 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48d43e1d-cc61-4a60-9636-dce4d726b1c6-console-oauth-config\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.122984 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48d43e1d-cc61-4a60-9636-dce4d726b1c6-console-serving-cert\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.123025 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48d43e1d-cc61-4a60-9636-dce4d726b1c6-oauth-serving-cert\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.123058 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48d43e1d-cc61-4a60-9636-dce4d726b1c6-trusted-ca-bundle\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.123085 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvlz\" (UniqueName: \"kubernetes.io/projected/48d43e1d-cc61-4a60-9636-dce4d726b1c6-kube-api-access-hpvlz\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.123111 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48d43e1d-cc61-4a60-9636-dce4d726b1c6-console-config\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.124022 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48d43e1d-cc61-4a60-9636-dce4d726b1c6-service-ca\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.124417 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48d43e1d-cc61-4a60-9636-dce4d726b1c6-console-config\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " 
pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.124844 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48d43e1d-cc61-4a60-9636-dce4d726b1c6-oauth-serving-cert\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.125817 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48d43e1d-cc61-4a60-9636-dce4d726b1c6-trusted-ca-bundle\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.127489 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48d43e1d-cc61-4a60-9636-dce4d726b1c6-console-oauth-config\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.128829 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48d43e1d-cc61-4a60-9636-dce4d726b1c6-console-serving-cert\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.142794 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpvlz\" (UniqueName: \"kubernetes.io/projected/48d43e1d-cc61-4a60-9636-dce4d726b1c6-kube-api-access-hpvlz\") pod \"console-57cbf6ff7f-tn64r\" (UID: \"48d43e1d-cc61-4a60-9636-dce4d726b1c6\") " pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.296400 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-57cbf6ff7f-tn64r" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.306903 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-f2q6n"] Dec 02 07:57:11 crc kubenswrapper[4691]: W1202 07:57:11.312997 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddec2c181_9b51_4e2b_95d1_d98fa9102b3a.slice/crio-ce7fca8a6f5eb9a0dd7762a3a8c6c5420bf7eef08797cbcfcd22baee072a37b8 WatchSource:0}: Error finding container ce7fca8a6f5eb9a0dd7762a3a8c6c5420bf7eef08797cbcfcd22baee072a37b8: Status 404 returned error can't find the container with id ce7fca8a6f5eb9a0dd7762a3a8c6c5420bf7eef08797cbcfcd22baee072a37b8 Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.326098 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ef02c668-a715-48cb-8efb-9c52cdb28e9d-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8slld\" (UID: \"ef02c668-a715-48cb-8efb-9c52cdb28e9d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.332848 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ef02c668-a715-48cb-8efb-9c52cdb28e9d-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-8slld\" (UID: \"ef02c668-a715-48cb-8efb-9c52cdb28e9d\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.375299 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp"] Dec 02 07:57:11 crc kubenswrapper[4691]: W1202 07:57:11.385849 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4e7cee0_91cd_406a_a496_a13b4ee91e1e.slice/crio-219e04d949b988d17fb23d8a0d590a5265b995a27e3a23f3e24de5e3c774f76a WatchSource:0}: Error finding container 219e04d949b988d17fb23d8a0d590a5265b995a27e3a23f3e24de5e3c774f76a: Status 404 returned error can't find the container with id 219e04d949b988d17fb23d8a0d590a5265b995a27e3a23f3e24de5e3c774f76a Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.523235 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-57cbf6ff7f-tn64r"] Dec 02 07:57:11 crc kubenswrapper[4691]: W1202 07:57:11.532873 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48d43e1d_cc61_4a60_9636_dce4d726b1c6.slice/crio-2a31811e3888e1f6a0c90ee589342b6ea665ff9c8079fd4a9ec8c4c7c71fa444 WatchSource:0}: Error finding container 2a31811e3888e1f6a0c90ee589342b6ea665ff9c8079fd4a9ec8c4c7c71fa444: Status 404 returned error can't find the container with id 2a31811e3888e1f6a0c90ee589342b6ea665ff9c8079fd4a9ec8c4c7c71fa444 Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.557873 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.602519 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57cbf6ff7f-tn64r" event={"ID":"48d43e1d-cc61-4a60-9636-dce4d726b1c6","Type":"ContainerStarted","Data":"2a31811e3888e1f6a0c90ee589342b6ea665ff9c8079fd4a9ec8c4c7c71fa444"} Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.603578 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" event={"ID":"e4e7cee0-91cd-406a-a496-a13b4ee91e1e","Type":"ContainerStarted","Data":"219e04d949b988d17fb23d8a0d590a5265b995a27e3a23f3e24de5e3c774f76a"} Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.605041 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-f2q6n" event={"ID":"dec2c181-9b51-4e2b-95d1-d98fa9102b3a","Type":"ContainerStarted","Data":"ce7fca8a6f5eb9a0dd7762a3a8c6c5420bf7eef08797cbcfcd22baee072a37b8"} Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.606715 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7scpl" event={"ID":"c0ce4e86-7cec-4db1-975d-51ee41f94337","Type":"ContainerStarted","Data":"c8063e1cac678214d51f052bc5df4c28d9f1faa1d09e2714b32f483c4d13a233"} Dec 02 07:57:11 crc kubenswrapper[4691]: I1202 07:57:11.943720 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld"] Dec 02 07:57:12 crc kubenswrapper[4691]: I1202 07:57:12.613719 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-57cbf6ff7f-tn64r" event={"ID":"48d43e1d-cc61-4a60-9636-dce4d726b1c6","Type":"ContainerStarted","Data":"91db274d81faf60ca2b4e935a30047b9fafb2d388b5186d50fb86af22cc90d58"} Dec 02 07:57:12 crc kubenswrapper[4691]: I1202 07:57:12.614963 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" event={"ID":"ef02c668-a715-48cb-8efb-9c52cdb28e9d","Type":"ContainerStarted","Data":"e33fe23b1a9a422af09ed62221b312c810af2a1ea3dbda1412b181156155b6f8"} Dec 02 07:57:12 crc kubenswrapper[4691]: I1202 07:57:12.654164 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-57cbf6ff7f-tn64r" podStartSLOduration=2.654129962 podStartE2EDuration="2.654129962s" podCreationTimestamp="2025-12-02 07:57:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:57:12.65243307 +0000 UTC m=+680.436511942" watchObservedRunningTime="2025-12-02 07:57:12.654129962 +0000 UTC m=+680.438208834" Dec 02 07:57:15 crc kubenswrapper[4691]: I1202 07:57:15.638901 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" event={"ID":"e4e7cee0-91cd-406a-a496-a13b4ee91e1e","Type":"ContainerStarted","Data":"989c786efdfaa3fc36846588f05626c8771fb4c8e8a7b45415f600591e13fd70"} Dec 02 07:57:15 crc kubenswrapper[4691]: I1202 07:57:15.640922 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" event={"ID":"ef02c668-a715-48cb-8efb-9c52cdb28e9d","Type":"ContainerStarted","Data":"dc3f793f2cd3e41447eb4098960e9422287b3a7e837616981a74e4473013a21b"} Dec 02 07:57:15 crc kubenswrapper[4691]: I1202 07:57:15.641101 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" Dec 02 07:57:15 crc kubenswrapper[4691]: I1202 07:57:15.642453 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-f2q6n" event={"ID":"dec2c181-9b51-4e2b-95d1-d98fa9102b3a","Type":"ContainerStarted","Data":"57500fe60756773c6391c529d9160b3f82be65b5ca017dd99ec9fe0205dc5341"} Dec 02 07:57:15 crc kubenswrapper[4691]: I1202 07:57:15.643629 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-7scpl" event={"ID":"c0ce4e86-7cec-4db1-975d-51ee41f94337","Type":"ContainerStarted","Data":"95fae3cb159f3ad50a58e5c7b6d4caff34fa4513bc4e5d16e38ab219f358836b"} Dec 02 07:57:15 crc kubenswrapper[4691]: I1202 07:57:15.643775 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-7scpl" Dec 02 07:57:15 crc kubenswrapper[4691]: I1202 07:57:15.659832 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-wqdhp" podStartSLOduration=1.7489649950000001 podStartE2EDuration="5.659753378s" podCreationTimestamp="2025-12-02 07:57:10 +0000 UTC" firstStartedPulling="2025-12-02 07:57:11.388362816 +0000 UTC m=+679.172441678" lastFinishedPulling="2025-12-02 07:57:15.299151199 +0000 UTC m=+683.083230061" observedRunningTime="2025-12-02 07:57:15.655258606 +0000 UTC m=+683.439337468" watchObservedRunningTime="2025-12-02 07:57:15.659753378 +0000 UTC m=+683.443832240" Dec 02 07:57:15 crc kubenswrapper[4691]: I1202 07:57:15.714869 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-7scpl" podStartSLOduration=1.467075924 podStartE2EDuration="5.714845449s" podCreationTimestamp="2025-12-02 07:57:10 +0000 UTC" firstStartedPulling="2025-12-02 07:57:11.057783643 +0000 UTC m=+678.841862505" lastFinishedPulling="2025-12-02 07:57:15.305553168 +0000 UTC m=+683.089632030" observedRunningTime="2025-12-02 07:57:15.713027943 +0000 UTC m=+683.497106815" watchObservedRunningTime="2025-12-02 07:57:15.714845449 +0000 UTC m=+683.498924321" Dec 02 07:57:15 crc kubenswrapper[4691]: I1202 07:57:15.732886 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld" podStartSLOduration=2.379927531 podStartE2EDuration="5.732861677s" podCreationTimestamp="2025-12-02 07:57:10 +0000 UTC" firstStartedPulling="2025-12-02 07:57:11.956592601 +0000 UTC m=+679.740671463" lastFinishedPulling="2025-12-02 07:57:15.309526747 +0000 UTC m=+683.093605609" observedRunningTime="2025-12-02 07:57:15.731944654 +0000 UTC m=+683.516023536" watchObservedRunningTime="2025-12-02 07:57:15.732861677 +0000 UTC m=+683.516940539" Dec 02 07:57:18 crc kubenswrapper[4691]: I1202 07:57:18.665912 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-f2q6n" event={"ID":"dec2c181-9b51-4e2b-95d1-d98fa9102b3a","Type":"ContainerStarted","Data":"4e2804e36034475714f0db9ed84cc6275e4cb5d99b5a1c1ebd592fe16b87f820"} Dec 02 07:57:18 crc kubenswrapper[4691]: I1202 07:57:18.702645 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-f2q6n" podStartSLOduration=2.5675200990000002 podStartE2EDuration="8.702611162s" podCreationTimestamp="2025-12-02 07:57:10 +0000 UTC" firstStartedPulling="2025-12-02 07:57:11.317972625 +0000 UTC m=+679.102051507" lastFinishedPulling="2025-12-02 
Dec 02 07:57:21 crc kubenswrapper[4691]: I1202 07:57:21.015917 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-7scpl"
Dec 02 07:57:21 crc kubenswrapper[4691]: I1202 07:57:21.296988 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-57cbf6ff7f-tn64r"
Dec 02 07:57:21 crc kubenswrapper[4691]: I1202 07:57:21.297243 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-57cbf6ff7f-tn64r"
Dec 02 07:57:21 crc kubenswrapper[4691]: I1202 07:57:21.301577 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-57cbf6ff7f-tn64r"
Dec 02 07:57:21 crc kubenswrapper[4691]: I1202 07:57:21.688186 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-57cbf6ff7f-tn64r"
Dec 02 07:57:21 crc kubenswrapper[4691]: I1202 07:57:21.748285 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wx6m2"]
Dec 02 07:57:31 crc kubenswrapper[4691]: I1202 07:57:31.564657 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-8slld"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.270869 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"]
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.272359 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.274348 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.281683 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"]
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.379168 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt9kj\" (UniqueName: \"kubernetes.io/projected/e067e362-dd94-4d98-83b9-e3108fbdef06-kube-api-access-mt9kj\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk\" (UID: \"e067e362-dd94-4d98-83b9-e3108fbdef06\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.379229 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e067e362-dd94-4d98-83b9-e3108fbdef06-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk\" (UID: \"e067e362-dd94-4d98-83b9-e3108fbdef06\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.379274 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e067e362-dd94-4d98-83b9-e3108fbdef06-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk\" (UID: \"e067e362-dd94-4d98-83b9-e3108fbdef06\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.480556 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt9kj\" (UniqueName: \"kubernetes.io/projected/e067e362-dd94-4d98-83b9-e3108fbdef06-kube-api-access-mt9kj\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk\" (UID: \"e067e362-dd94-4d98-83b9-e3108fbdef06\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.480603 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e067e362-dd94-4d98-83b9-e3108fbdef06-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk\" (UID: \"e067e362-dd94-4d98-83b9-e3108fbdef06\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.480636 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e067e362-dd94-4d98-83b9-e3108fbdef06-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk\" (UID: \"e067e362-dd94-4d98-83b9-e3108fbdef06\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.481306 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e067e362-dd94-4d98-83b9-e3108fbdef06-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk\" (UID: \"e067e362-dd94-4d98-83b9-e3108fbdef06\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.481386 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e067e362-dd94-4d98-83b9-e3108fbdef06-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk\" (UID: \"e067e362-dd94-4d98-83b9-e3108fbdef06\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.498501 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt9kj\" (UniqueName: \"kubernetes.io/projected/e067e362-dd94-4d98-83b9-e3108fbdef06-kube-api-access-mt9kj\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk\" (UID: \"e067e362-dd94-4d98-83b9-e3108fbdef06\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.589801 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.836457 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-wx6m2" podUID="c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" containerName="console" containerID="cri-o://f0d8a41f951868ea3425f4c830e96144fc129178e40d31649ba6dd18b201854e" gracePeriod=15
Dec 02 07:57:46 crc kubenswrapper[4691]: I1202 07:57:46.993151 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk"]
Dec 02 07:57:47 crc kubenswrapper[4691]: I1202 07:57:47.929201 4691 generic.go:334] "Generic (PLEG): container finished" podID="e067e362-dd94-4d98-83b9-e3108fbdef06" containerID="2e30f2b096ca46f170b8d8688a2f7e992482520589179db588b082ec94149eb5" exitCode=0
Dec 02 07:57:47 crc kubenswrapper[4691]: I1202 07:57:47.929590 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk" event={"ID":"e067e362-dd94-4d98-83b9-e3108fbdef06","Type":"ContainerDied","Data":"2e30f2b096ca46f170b8d8688a2f7e992482520589179db588b082ec94149eb5"}
Dec 02 07:57:47 crc kubenswrapper[4691]: I1202 07:57:47.929626 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk" event={"ID":"e067e362-dd94-4d98-83b9-e3108fbdef06","Type":"ContainerStarted","Data":"4d5969041d2f38eee149f2130a381dd801b5e5a2746ccc93c4060f5ef04037e5"}
Dec 02 07:57:47 crc kubenswrapper[4691]: I1202 07:57:47.931512 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wx6m2_c194920b-f7b2-4cb0-ae33-a2136ecbd2b4/console/0.log"
Dec 02 07:57:47 crc kubenswrapper[4691]: I1202 07:57:47.931557 4691 generic.go:334] "Generic (PLEG): container finished" podID="c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" containerID="f0d8a41f951868ea3425f4c830e96144fc129178e40d31649ba6dd18b201854e" exitCode=2
Dec 02 07:57:47 crc kubenswrapper[4691]: I1202 07:57:47.931593 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wx6m2" event={"ID":"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4","Type":"ContainerDied","Data":"f0d8a41f951868ea3425f4c830e96144fc129178e40d31649ba6dd18b201854e"}
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wx6m2" event={"ID":"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4","Type":"ContainerDied","Data":"f0d8a41f951868ea3425f4c830e96144fc129178e40d31649ba6dd18b201854e"} Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.317538 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wx6m2_c194920b-f7b2-4cb0-ae33-a2136ecbd2b4/console/0.log" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.317603 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.457077 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkwgt\" (UniqueName: \"kubernetes.io/projected/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-kube-api-access-rkwgt\") pod \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.457557 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-oauth-config\") pod \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.457598 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-trusted-ca-bundle\") pod \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.457621 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-oauth-serving-cert\") pod \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.457712 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-service-ca\") pod \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.457741 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-config\") pod \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.457785 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-serving-cert\") pod \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\" (UID: \"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4\") " Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.458905 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-service-ca" (OuterVolumeSpecName: "service-ca") pod "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" (UID: "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.458949 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" (UID: "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.458968 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-config" (OuterVolumeSpecName: "console-config") pod "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" (UID: "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.458982 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" (UID: "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.459444 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.459480 4691 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.459493 4691 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-service-ca\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.459505 4691 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.464883 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" (UID: "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.465234 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-kube-api-access-rkwgt" (OuterVolumeSpecName: "kube-api-access-rkwgt") pod "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" (UID: "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4"). InnerVolumeSpecName "kube-api-access-rkwgt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.465555 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" (UID: "c194920b-f7b2-4cb0-ae33-a2136ecbd2b4"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.560472 4691 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.560509 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkwgt\" (UniqueName: \"kubernetes.io/projected/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-kube-api-access-rkwgt\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.560542 4691 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.938197 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-wx6m2_c194920b-f7b2-4cb0-ae33-a2136ecbd2b4/console/0.log" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.938251 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-wx6m2" event={"ID":"c194920b-f7b2-4cb0-ae33-a2136ecbd2b4","Type":"ContainerDied","Data":"ee21421ae4fb6db73c078384220c18aad2b38ce487892fc3d4db60dc0f8bb1c1"} Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.938286 4691 scope.go:117] "RemoveContainer" containerID="f0d8a41f951868ea3425f4c830e96144fc129178e40d31649ba6dd18b201854e" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.938300 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-wx6m2" Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.960495 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-wx6m2"] Dec 02 07:57:48 crc kubenswrapper[4691]: I1202 07:57:48.964432 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-wx6m2"] Dec 02 07:57:50 crc kubenswrapper[4691]: I1202 07:57:50.568142 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" path="/var/lib/kubelet/pods/c194920b-f7b2-4cb0-ae33-a2136ecbd2b4/volumes" Dec 02 07:57:50 crc kubenswrapper[4691]: I1202 07:57:50.950501 4691 generic.go:334] "Generic (PLEG): container finished" podID="e067e362-dd94-4d98-83b9-e3108fbdef06" containerID="fc009751ac6830fc5502acb867f16a62dd3bc76b13f51d3af5a46371d1b3dae8" exitCode=0 Dec 02 07:57:50 crc kubenswrapper[4691]: I1202 07:57:50.950547 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk" event={"ID":"e067e362-dd94-4d98-83b9-e3108fbdef06","Type":"ContainerDied","Data":"fc009751ac6830fc5502acb867f16a62dd3bc76b13f51d3af5a46371d1b3dae8"} Dec 02 07:57:51 crc kubenswrapper[4691]: I1202 07:57:51.958735 4691 generic.go:334] "Generic (PLEG): container finished" podID="e067e362-dd94-4d98-83b9-e3108fbdef06" containerID="95755915acb64f61b45d3a714e13ad99b4f853a5f752d2e2be942459a206aa16" exitCode=0 Dec 02 07:57:51 crc kubenswrapper[4691]: I1202 07:57:51.958815 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk" event={"ID":"e067e362-dd94-4d98-83b9-e3108fbdef06","Type":"ContainerDied","Data":"95755915acb64f61b45d3a714e13ad99b4f853a5f752d2e2be942459a206aa16"} Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.191327 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk" Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.220343 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e067e362-dd94-4d98-83b9-e3108fbdef06-bundle\") pod \"e067e362-dd94-4d98-83b9-e3108fbdef06\" (UID: \"e067e362-dd94-4d98-83b9-e3108fbdef06\") " Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.220404 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mt9kj\" (UniqueName: \"kubernetes.io/projected/e067e362-dd94-4d98-83b9-e3108fbdef06-kube-api-access-mt9kj\") pod \"e067e362-dd94-4d98-83b9-e3108fbdef06\" (UID: \"e067e362-dd94-4d98-83b9-e3108fbdef06\") " Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.220426 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e067e362-dd94-4d98-83b9-e3108fbdef06-util\") pod \"e067e362-dd94-4d98-83b9-e3108fbdef06\" (UID: \"e067e362-dd94-4d98-83b9-e3108fbdef06\") " Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.221516 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e067e362-dd94-4d98-83b9-e3108fbdef06-bundle" (OuterVolumeSpecName: "bundle") pod "e067e362-dd94-4d98-83b9-e3108fbdef06" (UID: "e067e362-dd94-4d98-83b9-e3108fbdef06"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.227006 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e067e362-dd94-4d98-83b9-e3108fbdef06-kube-api-access-mt9kj" (OuterVolumeSpecName: "kube-api-access-mt9kj") pod "e067e362-dd94-4d98-83b9-e3108fbdef06" (UID: "e067e362-dd94-4d98-83b9-e3108fbdef06"). InnerVolumeSpecName "kube-api-access-mt9kj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.231298 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e067e362-dd94-4d98-83b9-e3108fbdef06-util" (OuterVolumeSpecName: "util") pod "e067e362-dd94-4d98-83b9-e3108fbdef06" (UID: "e067e362-dd94-4d98-83b9-e3108fbdef06"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.321879 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mt9kj\" (UniqueName: \"kubernetes.io/projected/e067e362-dd94-4d98-83b9-e3108fbdef06-kube-api-access-mt9kj\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.321921 4691 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e067e362-dd94-4d98-83b9-e3108fbdef06-util\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.321937 4691 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e067e362-dd94-4d98-83b9-e3108fbdef06-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.970536 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk" event={"ID":"e067e362-dd94-4d98-83b9-e3108fbdef06","Type":"ContainerDied","Data":"4d5969041d2f38eee149f2130a381dd801b5e5a2746ccc93c4060f5ef04037e5"} Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.970580 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d5969041d2f38eee149f2130a381dd801b5e5a2746ccc93c4060f5ef04037e5" Dec 02 07:57:53 crc kubenswrapper[4691]: I1202 07:57:53.970587 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.329477 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j"] Dec 02 07:58:04 crc kubenswrapper[4691]: E1202 07:58:04.330302 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e067e362-dd94-4d98-83b9-e3108fbdef06" containerName="extract" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.330320 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e067e362-dd94-4d98-83b9-e3108fbdef06" containerName="extract" Dec 02 07:58:04 crc kubenswrapper[4691]: E1202 07:58:04.330336 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" containerName="console" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.330343 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" containerName="console" Dec 02 07:58:04 crc kubenswrapper[4691]: E1202 07:58:04.330367 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e067e362-dd94-4d98-83b9-e3108fbdef06" containerName="util" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.330375 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e067e362-dd94-4d98-83b9-e3108fbdef06" containerName="util" Dec 02 07:58:04 crc kubenswrapper[4691]: E1202 07:58:04.330384 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e067e362-dd94-4d98-83b9-e3108fbdef06" containerName="pull" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.330391 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e067e362-dd94-4d98-83b9-e3108fbdef06" containerName="pull" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.330515 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e067e362-dd94-4d98-83b9-e3108fbdef06" containerName="extract" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.330528 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c194920b-f7b2-4cb0-ae33-a2136ecbd2b4" containerName="console" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.331049 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.333496 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.333681 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.334937 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-4wwqk" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.341570 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.342467 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.357613 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j"] Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.373698 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpfvq\" (UniqueName: \"kubernetes.io/projected/f921d74e-40cb-430a-a228-ec4681e9251d-kube-api-access-fpfvq\") pod \"metallb-operator-controller-manager-568f45db7d-dr99j\" (UID: \"f921d74e-40cb-430a-a228-ec4681e9251d\") " pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.373795 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f921d74e-40cb-430a-a228-ec4681e9251d-apiservice-cert\") pod \"metallb-operator-controller-manager-568f45db7d-dr99j\" (UID: \"f921d74e-40cb-430a-a228-ec4681e9251d\") " pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.373836 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f921d74e-40cb-430a-a228-ec4681e9251d-webhook-cert\") pod \"metallb-operator-controller-manager-568f45db7d-dr99j\" (UID: \"f921d74e-40cb-430a-a228-ec4681e9251d\") " pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.476815 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpfvq\" (UniqueName: \"kubernetes.io/projected/f921d74e-40cb-430a-a228-ec4681e9251d-kube-api-access-fpfvq\") pod \"metallb-operator-controller-manager-568f45db7d-dr99j\" (UID: \"f921d74e-40cb-430a-a228-ec4681e9251d\") " pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.476979 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f921d74e-40cb-430a-a228-ec4681e9251d-apiservice-cert\") pod \"metallb-operator-controller-manager-568f45db7d-dr99j\" (UID: \"f921d74e-40cb-430a-a228-ec4681e9251d\") " pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.477066 
4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f921d74e-40cb-430a-a228-ec4681e9251d-webhook-cert\") pod \"metallb-operator-controller-manager-568f45db7d-dr99j\" (UID: \"f921d74e-40cb-430a-a228-ec4681e9251d\") " pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.505921 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f921d74e-40cb-430a-a228-ec4681e9251d-apiservice-cert\") pod \"metallb-operator-controller-manager-568f45db7d-dr99j\" (UID: \"f921d74e-40cb-430a-a228-ec4681e9251d\") " pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.517584 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f921d74e-40cb-430a-a228-ec4681e9251d-webhook-cert\") pod \"metallb-operator-controller-manager-568f45db7d-dr99j\" (UID: \"f921d74e-40cb-430a-a228-ec4681e9251d\") " pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.525865 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpfvq\" (UniqueName: \"kubernetes.io/projected/f921d74e-40cb-430a-a228-ec4681e9251d-kube-api-access-fpfvq\") pod \"metallb-operator-controller-manager-568f45db7d-dr99j\" (UID: \"f921d74e-40cb-430a-a228-ec4681e9251d\") " pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.648855 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.937960 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb"] Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.938896 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.941471 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.941672 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.941949 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-wdgb9" Dec 02 07:58:04 crc kubenswrapper[4691]: I1202 07:58:04.958202 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb"] Dec 02 07:58:05 crc kubenswrapper[4691]: I1202 07:58:05.087115 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d3a05b7-702e-4564-9014-edffe6fc64ea-apiservice-cert\") pod \"metallb-operator-webhook-server-656d54dc75-wplwb\" (UID: \"0d3a05b7-702e-4564-9014-edffe6fc64ea\") " pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb" Dec 02 07:58:05 crc kubenswrapper[4691]: I1202 07:58:05.087235 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d3a05b7-702e-4564-9014-edffe6fc64ea-webhook-cert\") pod \"metallb-operator-webhook-server-656d54dc75-wplwb\" (UID: \"0d3a05b7-702e-4564-9014-edffe6fc64ea\") " pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb" Dec 02 07:58:05 crc kubenswrapper[4691]: I1202 07:58:05.087300 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl87f\" (UniqueName: \"kubernetes.io/projected/0d3a05b7-702e-4564-9014-edffe6fc64ea-kube-api-access-kl87f\") pod \"metallb-operator-webhook-server-656d54dc75-wplwb\" (UID: \"0d3a05b7-702e-4564-9014-edffe6fc64ea\") " pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb" Dec 02 07:58:05 crc kubenswrapper[4691]: I1202 07:58:05.189088 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0d3a05b7-702e-4564-9014-edffe6fc64ea-webhook-cert\") pod \"metallb-operator-webhook-server-656d54dc75-wplwb\" (UID: \"0d3a05b7-702e-4564-9014-edffe6fc64ea\") " pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb" Dec 02 07:58:05 crc kubenswrapper[4691]: I1202 07:58:05.189199 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl87f\" (UniqueName: \"kubernetes.io/projected/0d3a05b7-702e-4564-9014-edffe6fc64ea-kube-api-access-kl87f\") pod \"metallb-operator-webhook-server-656d54dc75-wplwb\" (UID: \"0d3a05b7-702e-4564-9014-edffe6fc64ea\") " pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb" Dec 02 07:58:05 crc kubenswrapper[4691]: I1202 07:58:05.189294 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d3a05b7-702e-4564-9014-edffe6fc64ea-apiservice-cert\") pod \"metallb-operator-webhook-server-656d54dc75-wplwb\" (UID: \"0d3a05b7-702e-4564-9014-edffe6fc64ea\") " pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb" Dec 02 07:58:05 crc kubenswrapper[4691]: I1202 
Dec 02 07:58:05 crc kubenswrapper[4691]: I1202 07:58:05.201472 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0d3a05b7-702e-4564-9014-edffe6fc64ea-apiservice-cert\") pod \"metallb-operator-webhook-server-656d54dc75-wplwb\" (UID: \"0d3a05b7-702e-4564-9014-edffe6fc64ea\") " pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb"
Dec 02 07:58:05 crc kubenswrapper[4691]: I1202 07:58:05.209522 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl87f\" (UniqueName: \"kubernetes.io/projected/0d3a05b7-702e-4564-9014-edffe6fc64ea-kube-api-access-kl87f\") pod \"metallb-operator-webhook-server-656d54dc75-wplwb\" (UID: \"0d3a05b7-702e-4564-9014-edffe6fc64ea\") " pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb"
Dec 02 07:58:05 crc kubenswrapper[4691]: I1202 07:58:05.242465 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j"]
Dec 02 07:58:05 crc kubenswrapper[4691]: I1202 07:58:05.257752 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb"
Dec 02 07:58:05 crc kubenswrapper[4691]: I1202 07:58:05.829660 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb"]
Dec 02 07:58:05 crc kubenswrapper[4691]: W1202 07:58:05.837945 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d3a05b7_702e_4564_9014_edffe6fc64ea.slice/crio-6c174fb3ed28f16b7b837959cf91de1e56dad461e519c97d049db3aa66b67e6d WatchSource:0}: Error finding container 6c174fb3ed28f16b7b837959cf91de1e56dad461e519c97d049db3aa66b67e6d: Status 404 returned error can't find the container with id 6c174fb3ed28f16b7b837959cf91de1e56dad461e519c97d049db3aa66b67e6d
Dec 02 07:58:06 crc kubenswrapper[4691]: I1202 07:58:06.049013 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" event={"ID":"f921d74e-40cb-430a-a228-ec4681e9251d","Type":"ContainerStarted","Data":"a3dec020e00cdc2f1b2f115aa08414ece00b62c86c170d443d13e02712b910b0"}
Dec 02 07:58:06 crc kubenswrapper[4691]: I1202 07:58:06.050141 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb" event={"ID":"0d3a05b7-702e-4564-9014-edffe6fc64ea","Type":"ContainerStarted","Data":"6c174fb3ed28f16b7b837959cf91de1e56dad461e519c97d049db3aa66b67e6d"}
Dec 02 07:58:17 crc kubenswrapper[4691]: I1202 07:58:17.282745 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" event={"ID":"f921d74e-40cb-430a-a228-ec4681e9251d","Type":"ContainerStarted","Data":"08cd3bb63c049824fec3915dbf2776fcfeeb5222039bf006c6e1253b889eca17"}
Dec 02 07:58:17 crc kubenswrapper[4691]: I1202 07:58:17.283391 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j"
Dec 02 07:58:17 crc kubenswrapper[4691]: I1202 07:58:17.284217 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb" event={"ID":"0d3a05b7-702e-4564-9014-edffe6fc64ea","Type":"ContainerStarted","Data":"03f92462c92fff8aea73a7c812b333ef87482c7c6a7871236071cb1777a60d68"}
Dec 02 07:58:17 crc kubenswrapper[4691]: I1202 07:58:17.284351 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb"
Dec 02 07:58:17 crc kubenswrapper[4691]: I1202 07:58:17.301524 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j" podStartSLOduration=2.276288208 podStartE2EDuration="13.301511272s" podCreationTimestamp="2025-12-02 07:58:04 +0000 UTC" firstStartedPulling="2025-12-02 07:58:05.260672712 +0000 UTC m=+733.044751574" lastFinishedPulling="2025-12-02 07:58:16.285895776 +0000 UTC m=+744.069974638" observedRunningTime="2025-12-02 07:58:17.30058765 +0000 UTC m=+745.084666522" watchObservedRunningTime="2025-12-02 07:58:17.301511272 +0000 UTC m=+745.085590124"
Dec 02 07:58:17 crc kubenswrapper[4691]: I1202 07:58:17.323005 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb" podStartSLOduration=2.869722621 podStartE2EDuration="13.32298906s" podCreationTimestamp="2025-12-02 07:58:04 +0000 UTC" firstStartedPulling="2025-12-02 07:58:05.840415845 +0000 UTC m=+733.624494707" lastFinishedPulling="2025-12-02 07:58:16.293682284 +0000 UTC m=+744.077761146" observedRunningTime="2025-12-02 07:58:17.319442315 +0000 UTC m=+745.103521177" watchObservedRunningTime="2025-12-02 07:58:17.32298906 +0000 UTC m=+745.107067922"
Dec 02 07:58:18 crc kubenswrapper[4691]: I1202 07:58:18.550442 4691 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Dec 02 07:58:35 crc kubenswrapper[4691]: I1202 07:58:35.261934 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-656d54dc75-wplwb"
Dec 02 07:58:51 crc kubenswrapper[4691]: I1202 07:58:51.898843 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 07:58:51 crc kubenswrapper[4691]: I1202 07:58:51.899523 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 07:58:54 crc kubenswrapper[4691]: I1202 07:58:54.654625 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-568f45db7d-dr99j"
Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.541279 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-p9xxd"]
Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.545549 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-p9xxd"
Need to start a new one" pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.547485 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc"] Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.548492 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.549656 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.549844 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.549958 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-td775" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.551586 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.556437 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc"] Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.556462 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7790b39b-8f2f-4637-8514-497457777b14-frr-startup\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.556547 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxdts\" (UniqueName: \"kubernetes.io/projected/14bf1d46-7291-4940-9d53-9361142ad142-kube-api-access-lxdts\") pod \"frr-k8s-webhook-server-7fcb986d4-7vgpc\" (UID: \"14bf1d46-7291-4940-9d53-9361142ad142\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.556644 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7790b39b-8f2f-4637-8514-497457777b14-reloader\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.556694 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjct7\" (UniqueName: \"kubernetes.io/projected/7790b39b-8f2f-4637-8514-497457777b14-kube-api-access-pjct7\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.556715 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7790b39b-8f2f-4637-8514-497457777b14-frr-conf\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.556750 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7790b39b-8f2f-4637-8514-497457777b14-metrics-certs\") pod 
\"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.556812 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7790b39b-8f2f-4637-8514-497457777b14-frr-sockets\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.556830 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14bf1d46-7291-4940-9d53-9361142ad142-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7vgpc\" (UID: \"14bf1d46-7291-4940-9d53-9361142ad142\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.556882 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7790b39b-8f2f-4637-8514-497457777b14-metrics\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.646404 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-bv29g"] Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.647520 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-bv29g" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.649635 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.653259 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.655650 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.656797 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-78rlf" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658116 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjct7\" (UniqueName: \"kubernetes.io/projected/7790b39b-8f2f-4637-8514-497457777b14-kube-api-access-pjct7\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658154 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7790b39b-8f2f-4637-8514-497457777b14-frr-conf\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658187 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7790b39b-8f2f-4637-8514-497457777b14-metrics-certs\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658206 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/defc2342-4163-4319-afaa-fa2eb042082c-metallb-excludel2\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658229 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7790b39b-8f2f-4637-8514-497457777b14-frr-sockets\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658242 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14bf1d46-7291-4940-9d53-9361142ad142-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7vgpc\" (UID: \"14bf1d46-7291-4940-9d53-9361142ad142\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658260 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-memberlist\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658277 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-metrics-certs\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658298 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7790b39b-8f2f-4637-8514-497457777b14-metrics\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658318 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7790b39b-8f2f-4637-8514-497457777b14-frr-startup\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658347 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxdts\" (UniqueName: \"kubernetes.io/projected/14bf1d46-7291-4940-9d53-9361142ad142-kube-api-access-lxdts\") pod \"frr-k8s-webhook-server-7fcb986d4-7vgpc\" (UID: \"14bf1d46-7291-4940-9d53-9361142ad142\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658364 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c26r\" (UniqueName: \"kubernetes.io/projected/defc2342-4163-4319-afaa-fa2eb042082c-kube-api-access-9c26r\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658406 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7790b39b-8f2f-4637-8514-497457777b14-reloader\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " 
pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.658889 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7790b39b-8f2f-4637-8514-497457777b14-reloader\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.660226 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7790b39b-8f2f-4637-8514-497457777b14-frr-conf\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.660505 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7790b39b-8f2f-4637-8514-497457777b14-frr-sockets\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.661542 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7790b39b-8f2f-4637-8514-497457777b14-metrics\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.662130 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-zj6r5"] Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.662210 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7790b39b-8f2f-4637-8514-497457777b14-frr-startup\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.663280 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.668293 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.668957 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/14bf1d46-7291-4940-9d53-9361142ad142-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-7vgpc\" (UID: \"14bf1d46-7291-4940-9d53-9361142ad142\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.672894 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-zj6r5"] Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.677329 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7790b39b-8f2f-4637-8514-497457777b14-metrics-certs\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.697386 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjct7\" (UniqueName: \"kubernetes.io/projected/7790b39b-8f2f-4637-8514-497457777b14-kube-api-access-pjct7\") pod \"frr-k8s-p9xxd\" (UID: \"7790b39b-8f2f-4637-8514-497457777b14\") " pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.699476 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxdts\" (UniqueName: \"kubernetes.io/projected/14bf1d46-7291-4940-9d53-9361142ad142-kube-api-access-lxdts\") pod \"frr-k8s-webhook-server-7fcb986d4-7vgpc\" (UID: \"14bf1d46-7291-4940-9d53-9361142ad142\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.760114 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/defc2342-4163-4319-afaa-fa2eb042082c-metallb-excludel2\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.760199 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bw44\" (UniqueName: \"kubernetes.io/projected/d016272b-85ec-410a-9392-050c9c0a5ff1-kube-api-access-5bw44\") pod \"controller-f8648f98b-zj6r5\" (UID: \"d016272b-85ec-410a-9392-050c9c0a5ff1\") " pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.760244 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-memberlist\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.760285 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-metrics-certs\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:55 crc kubenswrapper[4691]: E1202 07:58:55.760388 4691 secret.go:188] Couldn't get secret 
metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.760418 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c26r\" (UniqueName: \"kubernetes.io/projected/defc2342-4163-4319-afaa-fa2eb042082c-kube-api-access-9c26r\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:55 crc kubenswrapper[4691]: E1202 07:58:55.760461 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-memberlist podName:defc2342-4163-4319-afaa-fa2eb042082c nodeName:}" failed. No retries permitted until 2025-12-02 07:58:56.260441786 +0000 UTC m=+784.044520648 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-memberlist") pod "speaker-bv29g" (UID: "defc2342-4163-4319-afaa-fa2eb042082c") : secret "metallb-memberlist" not found Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.760510 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d016272b-85ec-410a-9392-050c9c0a5ff1-metrics-certs\") pod \"controller-f8648f98b-zj6r5\" (UID: \"d016272b-85ec-410a-9392-050c9c0a5ff1\") " pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:58:55 crc kubenswrapper[4691]: E1202 07:58:55.760515 4691 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 02 07:58:55 crc kubenswrapper[4691]: E1202 07:58:55.760617 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-metrics-certs podName:defc2342-4163-4319-afaa-fa2eb042082c nodeName:}" failed. No retries permitted until 2025-12-02 07:58:56.26059371 +0000 UTC m=+784.044672572 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-metrics-certs") pod "speaker-bv29g" (UID: "defc2342-4163-4319-afaa-fa2eb042082c") : secret "speaker-certs-secret" not found Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.760538 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d016272b-85ec-410a-9392-050c9c0a5ff1-cert\") pod \"controller-f8648f98b-zj6r5\" (UID: \"d016272b-85ec-410a-9392-050c9c0a5ff1\") " pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.761579 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/defc2342-4163-4319-afaa-fa2eb042082c-metallb-excludel2\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.778117 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c26r\" (UniqueName: \"kubernetes.io/projected/defc2342-4163-4319-afaa-fa2eb042082c-kube-api-access-9c26r\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.861993 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bw44\" (UniqueName: \"kubernetes.io/projected/d016272b-85ec-410a-9392-050c9c0a5ff1-kube-api-access-5bw44\") pod \"controller-f8648f98b-zj6r5\" (UID: \"d016272b-85ec-410a-9392-050c9c0a5ff1\") " pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.862117 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d016272b-85ec-410a-9392-050c9c0a5ff1-metrics-certs\") pod \"controller-f8648f98b-zj6r5\" (UID: \"d016272b-85ec-410a-9392-050c9c0a5ff1\") " pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.862137 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d016272b-85ec-410a-9392-050c9c0a5ff1-cert\") pod \"controller-f8648f98b-zj6r5\" (UID: \"d016272b-85ec-410a-9392-050c9c0a5ff1\") " pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.863944 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.864268 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.865559 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d016272b-85ec-410a-9392-050c9c0a5ff1-metrics-certs\") pod \"controller-f8648f98b-zj6r5\" (UID: \"d016272b-85ec-410a-9392-050c9c0a5ff1\") " pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.873446 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.877428 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d016272b-85ec-410a-9392-050c9c0a5ff1-cert\") pod \"controller-f8648f98b-zj6r5\" (UID: \"d016272b-85ec-410a-9392-050c9c0a5ff1\") " pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:58:55 crc kubenswrapper[4691]: I1202 07:58:55.879383 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bw44\" (UniqueName: \"kubernetes.io/projected/d016272b-85ec-410a-9392-050c9c0a5ff1-kube-api-access-5bw44\") pod \"controller-f8648f98b-zj6r5\" (UID: \"d016272b-85ec-410a-9392-050c9c0a5ff1\") " pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:58:56 crc kubenswrapper[4691]: I1202 07:58:56.046502 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:58:56 crc kubenswrapper[4691]: I1202 07:58:56.412377 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-memberlist\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:56 crc kubenswrapper[4691]: I1202 07:58:56.412432 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-metrics-certs\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:56 crc kubenswrapper[4691]: E1202 07:58:56.412991 4691 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 02 07:58:56 crc kubenswrapper[4691]: E1202 07:58:56.413072 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-memberlist podName:defc2342-4163-4319-afaa-fa2eb042082c nodeName:}" failed. No retries permitted until 2025-12-02 07:58:57.413053476 +0000 UTC m=+785.197132338 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-memberlist") pod "speaker-bv29g" (UID: "defc2342-4163-4319-afaa-fa2eb042082c") : secret "metallb-memberlist" not found Dec 02 07:58:56 crc kubenswrapper[4691]: I1202 07:58:56.420629 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-metrics-certs\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:56 crc kubenswrapper[4691]: I1202 07:58:56.441197 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc"] Dec 02 07:58:56 crc kubenswrapper[4691]: W1202 07:58:56.452941 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14bf1d46_7291_4940_9d53_9361142ad142.slice/crio-a45f7a7e158fb9afe7f5a73fde8fb05ea961d85dc2e6ec02b6c2d0b89ad7fdb5 WatchSource:0}: Error finding container a45f7a7e158fb9afe7f5a73fde8fb05ea961d85dc2e6ec02b6c2d0b89ad7fdb5: Status 404 returned error can't find the container with id a45f7a7e158fb9afe7f5a73fde8fb05ea961d85dc2e6ec02b6c2d0b89ad7fdb5 Dec 02 07:58:56 crc kubenswrapper[4691]: I1202 07:58:56.530140 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p9xxd" event={"ID":"7790b39b-8f2f-4637-8514-497457777b14","Type":"ContainerStarted","Data":"e5a59ea05e2c092b806145b195773473a1816d20fd61681c78c2a31bb0ef2448"} Dec 02 07:58:56 crc kubenswrapper[4691]: I1202 07:58:56.531182 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" event={"ID":"14bf1d46-7291-4940-9d53-9361142ad142","Type":"ContainerStarted","Data":"a45f7a7e158fb9afe7f5a73fde8fb05ea961d85dc2e6ec02b6c2d0b89ad7fdb5"} Dec 02 07:58:56 crc kubenswrapper[4691]: I1202 07:58:56.590226 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-zj6r5"] Dec 02 07:58:57 crc kubenswrapper[4691]: I1202 07:58:57.426542 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-memberlist\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:57 crc kubenswrapper[4691]: I1202 07:58:57.432811 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/defc2342-4163-4319-afaa-fa2eb042082c-memberlist\") pod \"speaker-bv29g\" (UID: \"defc2342-4163-4319-afaa-fa2eb042082c\") " pod="metallb-system/speaker-bv29g" Dec 02 07:58:57 crc kubenswrapper[4691]: I1202 07:58:57.471168 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-bv29g" Dec 02 07:58:57 crc kubenswrapper[4691]: I1202 07:58:57.539033 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-zj6r5" event={"ID":"d016272b-85ec-410a-9392-050c9c0a5ff1","Type":"ContainerStarted","Data":"f8b2e69a606f34f7fa2867e300c63ad3817b343dc4e90cc8a5ebe5c54c4907dc"} Dec 02 07:58:57 crc kubenswrapper[4691]: I1202 07:58:57.539084 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-zj6r5" event={"ID":"d016272b-85ec-410a-9392-050c9c0a5ff1","Type":"ContainerStarted","Data":"c633e378a1d3c249df638318ab0fe12137529e80f556ef90e61b533daeb8ae86"} Dec 02 07:58:57 crc kubenswrapper[4691]: I1202 07:58:57.539100 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-zj6r5" event={"ID":"d016272b-85ec-410a-9392-050c9c0a5ff1","Type":"ContainerStarted","Data":"aa3055001a19e5abc2f866286a23c2785b1be608be7cabc8cf92d5175b3be53d"} Dec 02 07:58:57 crc kubenswrapper[4691]: I1202 07:58:57.540197 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:58:57 crc kubenswrapper[4691]: I1202 07:58:57.541392 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bv29g" event={"ID":"defc2342-4163-4319-afaa-fa2eb042082c","Type":"ContainerStarted","Data":"dc7db9bc299178428963e1a2563e0c16767c27f9f2d1722f489d435cb5c9d691"} Dec 02 07:58:58 crc kubenswrapper[4691]: I1202 07:58:58.565796 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bv29g" event={"ID":"defc2342-4163-4319-afaa-fa2eb042082c","Type":"ContainerStarted","Data":"685be24572680d4707ce006221f8f790792c86c4f8da607eca47cf05abaff59f"} Dec 02 07:58:58 crc kubenswrapper[4691]: I1202 07:58:58.581387 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-bv29g" event={"ID":"defc2342-4163-4319-afaa-fa2eb042082c","Type":"ContainerStarted","Data":"d84d6da4cda44928e215eac98570b572deb6bc17379391c5a12c12d6bdf4ad37"} Dec 02 07:58:58 crc kubenswrapper[4691]: I1202 07:58:58.581449 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-bv29g" Dec 02 07:58:58 crc kubenswrapper[4691]: I1202 07:58:58.593675 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-zj6r5" podStartSLOduration=3.593654184 podStartE2EDuration="3.593654184s" podCreationTimestamp="2025-12-02 07:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:58:57.570142716 +0000 UTC m=+785.354221588" watchObservedRunningTime="2025-12-02 07:58:58.593654184 +0000 UTC m=+786.377733046" Dec 02 07:58:58 crc kubenswrapper[4691]: I1202 07:58:58.596486 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-bv29g" podStartSLOduration=3.596479182 podStartE2EDuration="3.596479182s" podCreationTimestamp="2025-12-02 07:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 07:58:58.5901827 +0000 UTC m=+786.374261572" watchObservedRunningTime="2025-12-02 07:58:58.596479182 +0000 UTC m=+786.380558044" Dec 02 07:59:06 crc kubenswrapper[4691]: I1202 07:59:06.050928 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/controller-f8648f98b-zj6r5" Dec 02 07:59:06 crc kubenswrapper[4691]: I1202 07:59:06.667134 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" event={"ID":"14bf1d46-7291-4940-9d53-9361142ad142","Type":"ContainerStarted","Data":"b2d35d17b69c01eac31299040b536f2c4065c22c26e14458b2f0c3e3d05774dc"} Dec 02 07:59:06 crc kubenswrapper[4691]: I1202 07:59:06.667358 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" Dec 02 07:59:06 crc kubenswrapper[4691]: I1202 07:59:06.669808 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p9xxd" event={"ID":"7790b39b-8f2f-4637-8514-497457777b14","Type":"ContainerDied","Data":"8117b773350e7a94290aa141628bdb56c708015618a80ddc4175ee79668856e4"} Dec 02 07:59:06 crc kubenswrapper[4691]: I1202 07:59:06.669811 4691 generic.go:334] "Generic (PLEG): container finished" podID="7790b39b-8f2f-4637-8514-497457777b14" containerID="8117b773350e7a94290aa141628bdb56c708015618a80ddc4175ee79668856e4" exitCode=0 Dec 02 07:59:06 crc kubenswrapper[4691]: I1202 07:59:06.694379 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" podStartSLOduration=2.6104938090000003 podStartE2EDuration="11.694352927s" podCreationTimestamp="2025-12-02 07:58:55 +0000 UTC" firstStartedPulling="2025-12-02 07:58:56.45548569 +0000 UTC m=+784.239564552" lastFinishedPulling="2025-12-02 07:59:05.539344808 +0000 UTC m=+793.323423670" observedRunningTime="2025-12-02 07:59:06.689799108 +0000 UTC m=+794.473877970" watchObservedRunningTime="2025-12-02 07:59:06.694352927 +0000 UTC m=+794.478431789" Dec 02 07:59:07 crc kubenswrapper[4691]: I1202 07:59:07.475722 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-bv29g" Dec 02 07:59:07 crc kubenswrapper[4691]: I1202 07:59:07.676636 4691 generic.go:334] "Generic (PLEG): container finished" podID="7790b39b-8f2f-4637-8514-497457777b14" containerID="0c7aa4acd2bb0a4dd67dbbbde1675654ee34ef31f2e660d58fd18b2a772ab711" exitCode=0 Dec 02 07:59:07 crc kubenswrapper[4691]: I1202 07:59:07.676691 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p9xxd" event={"ID":"7790b39b-8f2f-4637-8514-497457777b14","Type":"ContainerDied","Data":"0c7aa4acd2bb0a4dd67dbbbde1675654ee34ef31f2e660d58fd18b2a772ab711"} Dec 02 07:59:08 crc kubenswrapper[4691]: I1202 07:59:08.685346 4691 generic.go:334] "Generic (PLEG): container finished" podID="7790b39b-8f2f-4637-8514-497457777b14" containerID="457051c14657eef699e811932c86b3649771741e4eb13f0e6aab77f3fe10f81a" exitCode=0 Dec 02 07:59:08 crc kubenswrapper[4691]: I1202 07:59:08.685392 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p9xxd" event={"ID":"7790b39b-8f2f-4637-8514-497457777b14","Type":"ContainerDied","Data":"457051c14657eef699e811932c86b3649771741e4eb13f0e6aab77f3fe10f81a"} Dec 02 07:59:09 crc kubenswrapper[4691]: I1202 07:59:09.693639 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p9xxd" event={"ID":"7790b39b-8f2f-4637-8514-497457777b14","Type":"ContainerStarted","Data":"20319ce9ccf1d81dc2b2ed28e1ad9e2f3e38884bb3d6d07099a3afb4788452ef"} Dec 02 07:59:09 crc kubenswrapper[4691]: I1202 07:59:09.693984 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p9xxd" 
event={"ID":"7790b39b-8f2f-4637-8514-497457777b14","Type":"ContainerStarted","Data":"a5e69984624c598447ead007c2c068f563af3d3a3f549f14cb0fd9b9578b5c6a"} Dec 02 07:59:09 crc kubenswrapper[4691]: I1202 07:59:09.693999 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p9xxd" event={"ID":"7790b39b-8f2f-4637-8514-497457777b14","Type":"ContainerStarted","Data":"42796db336132c005195f6f880ac053cdd2a284d526f277fdee15f23493c38e6"} Dec 02 07:59:10 crc kubenswrapper[4691]: I1202 07:59:10.721347 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p9xxd" event={"ID":"7790b39b-8f2f-4637-8514-497457777b14","Type":"ContainerStarted","Data":"0bf78e8ec20f0e1f019b5f4a28fb0fb9e1d14dbfbfb742a0c6419dacf3b604a3"} Dec 02 07:59:10 crc kubenswrapper[4691]: I1202 07:59:10.721914 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p9xxd" event={"ID":"7790b39b-8f2f-4637-8514-497457777b14","Type":"ContainerStarted","Data":"2c72d8fce2907ae56234e7617ddd2a9a8d5ce7c7438072c39ace24667d38ac30"} Dec 02 07:59:10 crc kubenswrapper[4691]: I1202 07:59:10.748108 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-6r7w7"] Dec 02 07:59:10 crc kubenswrapper[4691]: I1202 07:59:10.748941 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-6r7w7" Dec 02 07:59:10 crc kubenswrapper[4691]: I1202 07:59:10.751093 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 02 07:59:10 crc kubenswrapper[4691]: I1202 07:59:10.751147 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-dvslw" Dec 02 07:59:10 crc kubenswrapper[4691]: I1202 07:59:10.756905 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 02 07:59:10 crc kubenswrapper[4691]: I1202 07:59:10.771876 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6r7w7"] Dec 02 07:59:10 crc kubenswrapper[4691]: I1202 07:59:10.857149 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cbpc\" (UniqueName: \"kubernetes.io/projected/439eb0cf-911e-4c98-8803-daf0f0fccf4d-kube-api-access-9cbpc\") pod \"openstack-operator-index-6r7w7\" (UID: \"439eb0cf-911e-4c98-8803-daf0f0fccf4d\") " pod="openstack-operators/openstack-operator-index-6r7w7" Dec 02 07:59:10 crc kubenswrapper[4691]: I1202 07:59:10.958717 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cbpc\" (UniqueName: \"kubernetes.io/projected/439eb0cf-911e-4c98-8803-daf0f0fccf4d-kube-api-access-9cbpc\") pod \"openstack-operator-index-6r7w7\" (UID: \"439eb0cf-911e-4c98-8803-daf0f0fccf4d\") " pod="openstack-operators/openstack-operator-index-6r7w7" Dec 02 07:59:10 crc kubenswrapper[4691]: I1202 07:59:10.982496 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cbpc\" (UniqueName: \"kubernetes.io/projected/439eb0cf-911e-4c98-8803-daf0f0fccf4d-kube-api-access-9cbpc\") pod \"openstack-operator-index-6r7w7\" (UID: \"439eb0cf-911e-4c98-8803-daf0f0fccf4d\") " pod="openstack-operators/openstack-operator-index-6r7w7" Dec 02 07:59:11 crc kubenswrapper[4691]: I1202 07:59:11.069887 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6r7w7" Dec 02 07:59:11 crc kubenswrapper[4691]: I1202 07:59:11.352283 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-6r7w7"] Dec 02 07:59:11 crc kubenswrapper[4691]: I1202 07:59:11.727891 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6r7w7" event={"ID":"439eb0cf-911e-4c98-8803-daf0f0fccf4d","Type":"ContainerStarted","Data":"d84bfe5f211085df79a2fb7ac94eae59fda47f6f3c5968af4d9b2a62f9ee21a9"} Dec 02 07:59:11 crc kubenswrapper[4691]: I1202 07:59:11.733636 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-p9xxd" event={"ID":"7790b39b-8f2f-4637-8514-497457777b14","Type":"ContainerStarted","Data":"4c1ae2e745e87ab06310df2c5f9071620930bb7e1b2ef3a8c91b84fc59774503"} Dec 02 07:59:11 crc kubenswrapper[4691]: I1202 07:59:11.733857 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:59:11 crc kubenswrapper[4691]: I1202 07:59:11.763503 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-p9xxd" podStartSLOduration=7.321857742 podStartE2EDuration="16.76348023s" podCreationTimestamp="2025-12-02 07:58:55 +0000 UTC" firstStartedPulling="2025-12-02 07:58:56.070469283 +0000 UTC m=+783.854548145" lastFinishedPulling="2025-12-02 07:59:05.512091771 +0000 UTC m=+793.296170633" observedRunningTime="2025-12-02 07:59:11.760225442 +0000 UTC m=+799.544304314" watchObservedRunningTime="2025-12-02 07:59:11.76348023 +0000 UTC m=+799.547559092" Dec 02 07:59:14 crc kubenswrapper[4691]: I1202 07:59:14.122453 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6r7w7"] Dec 02 07:59:14 crc kubenswrapper[4691]: I1202 07:59:14.723005 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-f2vhc"] Dec 02 07:59:14 crc kubenswrapper[4691]: I1202 07:59:14.725356 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-f2vhc" Dec 02 07:59:14 crc kubenswrapper[4691]: I1202 07:59:14.734510 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f2vhc"] Dec 02 07:59:14 crc kubenswrapper[4691]: I1202 07:59:14.772240 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6r7w7" event={"ID":"439eb0cf-911e-4c98-8803-daf0f0fccf4d","Type":"ContainerStarted","Data":"571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6"} Dec 02 07:59:14 crc kubenswrapper[4691]: I1202 07:59:14.772570 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-6r7w7" podUID="439eb0cf-911e-4c98-8803-daf0f0fccf4d" containerName="registry-server" containerID="cri-o://571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6" gracePeriod=2 Dec 02 07:59:14 crc kubenswrapper[4691]: I1202 07:59:14.793958 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-6r7w7" podStartSLOduration=2.059404669 podStartE2EDuration="4.793939407s" podCreationTimestamp="2025-12-02 07:59:10 +0000 UTC" firstStartedPulling="2025-12-02 07:59:11.364241611 +0000 UTC m=+799.148320473" lastFinishedPulling="2025-12-02 07:59:14.098776349 +0000 UTC m=+801.882855211" observedRunningTime="2025-12-02 07:59:14.790889473 +0000 UTC m=+802.574968335" watchObservedRunningTime="2025-12-02 07:59:14.793939407 +0000 UTC m=+802.578018269" Dec 02 07:59:14 crc kubenswrapper[4691]: I1202 07:59:14.817632 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psbbk\" (UniqueName: \"kubernetes.io/projected/22ecfc20-6eb6-417f-ab17-8fc55057d5af-kube-api-access-psbbk\") pod \"openstack-operator-index-f2vhc\" (UID: \"22ecfc20-6eb6-417f-ab17-8fc55057d5af\") " pod="openstack-operators/openstack-operator-index-f2vhc" Dec 02 07:59:14 crc kubenswrapper[4691]: I1202 07:59:14.919175 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psbbk\" (UniqueName: \"kubernetes.io/projected/22ecfc20-6eb6-417f-ab17-8fc55057d5af-kube-api-access-psbbk\") pod \"openstack-operator-index-f2vhc\" (UID: \"22ecfc20-6eb6-417f-ab17-8fc55057d5af\") " pod="openstack-operators/openstack-operator-index-f2vhc" Dec 02 07:59:14 crc kubenswrapper[4691]: I1202 07:59:14.938160 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psbbk\" (UniqueName: \"kubernetes.io/projected/22ecfc20-6eb6-417f-ab17-8fc55057d5af-kube-api-access-psbbk\") pod \"openstack-operator-index-f2vhc\" (UID: \"22ecfc20-6eb6-417f-ab17-8fc55057d5af\") " pod="openstack-operators/openstack-operator-index-f2vhc" Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.083036 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f2vhc" Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.107915 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6r7w7" Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.223743 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cbpc\" (UniqueName: \"kubernetes.io/projected/439eb0cf-911e-4c98-8803-daf0f0fccf4d-kube-api-access-9cbpc\") pod \"439eb0cf-911e-4c98-8803-daf0f0fccf4d\" (UID: \"439eb0cf-911e-4c98-8803-daf0f0fccf4d\") " Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.243922 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/439eb0cf-911e-4c98-8803-daf0f0fccf4d-kube-api-access-9cbpc" (OuterVolumeSpecName: "kube-api-access-9cbpc") pod "439eb0cf-911e-4c98-8803-daf0f0fccf4d" (UID: "439eb0cf-911e-4c98-8803-daf0f0fccf4d"). InnerVolumeSpecName "kube-api-access-9cbpc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.295374 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f2vhc"] Dec 02 07:59:15 crc kubenswrapper[4691]: W1202 07:59:15.298877 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ecfc20_6eb6_417f_ab17_8fc55057d5af.slice/crio-b78cbf4403310713d3bb0f703ba8dbbb7c7ad1c1171650e6e48953671ef0f160 WatchSource:0}: Error finding container b78cbf4403310713d3bb0f703ba8dbbb7c7ad1c1171650e6e48953671ef0f160: Status 404 returned error can't find the container with id b78cbf4403310713d3bb0f703ba8dbbb7c7ad1c1171650e6e48953671ef0f160 Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.325868 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cbpc\" (UniqueName: \"kubernetes.io/projected/439eb0cf-911e-4c98-8803-daf0f0fccf4d-kube-api-access-9cbpc\") on node \"crc\" DevicePath \"\"" Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.783300 4691 generic.go:334] "Generic (PLEG): container finished" podID="439eb0cf-911e-4c98-8803-daf0f0fccf4d" containerID="571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6" exitCode=0 Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.783414 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6r7w7" event={"ID":"439eb0cf-911e-4c98-8803-daf0f0fccf4d","Type":"ContainerDied","Data":"571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6"} Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.783453 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-6r7w7" event={"ID":"439eb0cf-911e-4c98-8803-daf0f0fccf4d","Type":"ContainerDied","Data":"d84bfe5f211085df79a2fb7ac94eae59fda47f6f3c5968af4d9b2a62f9ee21a9"} Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.783476 4691 scope.go:117] "RemoveContainer" containerID="571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6" Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.784048 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-6r7w7" Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.785465 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f2vhc" event={"ID":"22ecfc20-6eb6-417f-ab17-8fc55057d5af","Type":"ContainerStarted","Data":"d0051acd970db5042ed035aff2ce61cd9bde67bb1b58a98fa8f02725962d8301"} Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.785518 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f2vhc" event={"ID":"22ecfc20-6eb6-417f-ab17-8fc55057d5af","Type":"ContainerStarted","Data":"b78cbf4403310713d3bb0f703ba8dbbb7c7ad1c1171650e6e48953671ef0f160"} Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.802864 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-f2vhc" podStartSLOduration=1.7586525160000002 podStartE2EDuration="1.802844672s" podCreationTimestamp="2025-12-02 07:59:14 +0000 UTC" firstStartedPulling="2025-12-02 07:59:15.302442862 +0000 UTC m=+803.086521724" lastFinishedPulling="2025-12-02 07:59:15.346635028 +0000 UTC m=+803.130713880" observedRunningTime="2025-12-02 07:59:15.80233143 +0000 UTC m=+803.586410312" watchObservedRunningTime="2025-12-02 07:59:15.802844672 +0000 UTC m=+803.586923534" Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.805939 4691 scope.go:117] "RemoveContainer" containerID="571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6" Dec 02 07:59:15 crc kubenswrapper[4691]: E1202 07:59:15.807288 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6\": container with ID starting with 571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6 not found: ID does not exist" containerID="571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6" Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.807328 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6"} err="failed to get container status \"571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6\": rpc error: code = NotFound desc = could not find container \"571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6\": container with ID starting with 571db68324b35c079ce7aff062ecbac6ef4e7eb26d5206cc17beba621545ffe6 not found: ID does not exist" Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.823247 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-6r7w7"] Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.827449 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-6r7w7"] Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.865035 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.880924 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-7vgpc" Dec 02 07:59:15 crc kubenswrapper[4691]: I1202 07:59:15.908359 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:59:16 crc kubenswrapper[4691]: I1202 
07:59:16.568392 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="439eb0cf-911e-4c98-8803-daf0f0fccf4d" path="/var/lib/kubelet/pods/439eb0cf-911e-4c98-8803-daf0f0fccf4d/volumes" Dec 02 07:59:21 crc kubenswrapper[4691]: I1202 07:59:21.898814 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:59:21 crc kubenswrapper[4691]: I1202 07:59:21.899093 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:59:25 crc kubenswrapper[4691]: I1202 07:59:25.083652 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-f2vhc" Dec 02 07:59:25 crc kubenswrapper[4691]: I1202 07:59:25.084321 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-f2vhc" Dec 02 07:59:25 crc kubenswrapper[4691]: I1202 07:59:25.143323 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-f2vhc" Dec 02 07:59:25 crc kubenswrapper[4691]: I1202 07:59:25.869851 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-p9xxd" Dec 02 07:59:25 crc kubenswrapper[4691]: I1202 07:59:25.902398 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-f2vhc" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.071675 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h"] Dec 02 07:59:33 crc kubenswrapper[4691]: E1202 07:59:33.072813 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="439eb0cf-911e-4c98-8803-daf0f0fccf4d" containerName="registry-server" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.072830 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="439eb0cf-911e-4c98-8803-daf0f0fccf4d" containerName="registry-server" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.073179 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="439eb0cf-911e-4c98-8803-daf0f0fccf4d" containerName="registry-server" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.075479 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.077516 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-9prvt" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.085812 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h"] Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.181046 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r84lc\" (UniqueName: \"kubernetes.io/projected/337754ee-e2dc-4a26-84f4-6010c0f73133-kube-api-access-r84lc\") pod \"eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h\" (UID: \"337754ee-e2dc-4a26-84f4-6010c0f73133\") " pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.181412 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/337754ee-e2dc-4a26-84f4-6010c0f73133-util\") pod \"eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h\" (UID: \"337754ee-e2dc-4a26-84f4-6010c0f73133\") " pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.181592 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/337754ee-e2dc-4a26-84f4-6010c0f73133-bundle\") pod \"eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h\" (UID: \"337754ee-e2dc-4a26-84f4-6010c0f73133\") " pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.283843 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/337754ee-e2dc-4a26-84f4-6010c0f73133-util\") pod \"eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h\" (UID: \"337754ee-e2dc-4a26-84f4-6010c0f73133\") " pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.283921 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/337754ee-e2dc-4a26-84f4-6010c0f73133-bundle\") pod \"eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h\" (UID: \"337754ee-e2dc-4a26-84f4-6010c0f73133\") " pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.283974 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r84lc\" (UniqueName: \"kubernetes.io/projected/337754ee-e2dc-4a26-84f4-6010c0f73133-kube-api-access-r84lc\") pod \"eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h\" (UID: \"337754ee-e2dc-4a26-84f4-6010c0f73133\") " pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.284562 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/337754ee-e2dc-4a26-84f4-6010c0f73133-util\") pod \"eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h\" (UID: \"337754ee-e2dc-4a26-84f4-6010c0f73133\") " pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.284617 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/337754ee-e2dc-4a26-84f4-6010c0f73133-bundle\") pod \"eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h\" (UID: \"337754ee-e2dc-4a26-84f4-6010c0f73133\") " pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.306859 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r84lc\" (UniqueName: \"kubernetes.io/projected/337754ee-e2dc-4a26-84f4-6010c0f73133-kube-api-access-r84lc\") pod \"eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h\" (UID: \"337754ee-e2dc-4a26-84f4-6010c0f73133\") " pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.406744 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.865465 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h"] Dec 02 07:59:33 crc kubenswrapper[4691]: I1202 07:59:33.906498 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" event={"ID":"337754ee-e2dc-4a26-84f4-6010c0f73133","Type":"ContainerStarted","Data":"218b89e5ae6a8a02c883df34cd67c77c7d8f91e769ad9d56ac55eb69673c2b30"} Dec 02 07:59:34 crc kubenswrapper[4691]: I1202 07:59:34.913909 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" event={"ID":"337754ee-e2dc-4a26-84f4-6010c0f73133","Type":"ContainerStarted","Data":"3c841973239ec260c42908c2722b40845fdf3e84f0ea2bf0db5da274b693404f"} Dec 02 07:59:35 crc kubenswrapper[4691]: I1202 07:59:35.925344 4691 generic.go:334] "Generic (PLEG): container finished" podID="337754ee-e2dc-4a26-84f4-6010c0f73133" containerID="3c841973239ec260c42908c2722b40845fdf3e84f0ea2bf0db5da274b693404f" exitCode=0 Dec 02 07:59:35 crc kubenswrapper[4691]: I1202 07:59:35.925419 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" event={"ID":"337754ee-e2dc-4a26-84f4-6010c0f73133","Type":"ContainerDied","Data":"3c841973239ec260c42908c2722b40845fdf3e84f0ea2bf0db5da274b693404f"} Dec 02 07:59:36 crc kubenswrapper[4691]: I1202 07:59:36.936099 4691 generic.go:334] "Generic (PLEG): container finished" podID="337754ee-e2dc-4a26-84f4-6010c0f73133" containerID="6527ca1f09aca1ee2c5d97fa8e4c0918a10e2dc9c1187667e5d510fb2eb0d347" exitCode=0 Dec 02 07:59:36 crc kubenswrapper[4691]: I1202 07:59:36.936207 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" 
event={"ID":"337754ee-e2dc-4a26-84f4-6010c0f73133","Type":"ContainerDied","Data":"6527ca1f09aca1ee2c5d97fa8e4c0918a10e2dc9c1187667e5d510fb2eb0d347"} Dec 02 07:59:37 crc kubenswrapper[4691]: I1202 07:59:37.948809 4691 generic.go:334] "Generic (PLEG): container finished" podID="337754ee-e2dc-4a26-84f4-6010c0f73133" containerID="8d7646cb3eaeba7ab96c7f09cffc26ee41f8e227a26a8dc0a17d2fc15efed0cd" exitCode=0 Dec 02 07:59:37 crc kubenswrapper[4691]: I1202 07:59:37.948881 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" event={"ID":"337754ee-e2dc-4a26-84f4-6010c0f73133","Type":"ContainerDied","Data":"8d7646cb3eaeba7ab96c7f09cffc26ee41f8e227a26a8dc0a17d2fc15efed0cd"} Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.201485 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.344126 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/337754ee-e2dc-4a26-84f4-6010c0f73133-util\") pod \"337754ee-e2dc-4a26-84f4-6010c0f73133\" (UID: \"337754ee-e2dc-4a26-84f4-6010c0f73133\") " Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.344231 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r84lc\" (UniqueName: \"kubernetes.io/projected/337754ee-e2dc-4a26-84f4-6010c0f73133-kube-api-access-r84lc\") pod \"337754ee-e2dc-4a26-84f4-6010c0f73133\" (UID: \"337754ee-e2dc-4a26-84f4-6010c0f73133\") " Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.344353 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/337754ee-e2dc-4a26-84f4-6010c0f73133-bundle\") pod \"337754ee-e2dc-4a26-84f4-6010c0f73133\" (UID: \"337754ee-e2dc-4a26-84f4-6010c0f73133\") " Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.345645 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/337754ee-e2dc-4a26-84f4-6010c0f73133-bundle" (OuterVolumeSpecName: "bundle") pod "337754ee-e2dc-4a26-84f4-6010c0f73133" (UID: "337754ee-e2dc-4a26-84f4-6010c0f73133"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.351870 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/337754ee-e2dc-4a26-84f4-6010c0f73133-kube-api-access-r84lc" (OuterVolumeSpecName: "kube-api-access-r84lc") pod "337754ee-e2dc-4a26-84f4-6010c0f73133" (UID: "337754ee-e2dc-4a26-84f4-6010c0f73133"). InnerVolumeSpecName "kube-api-access-r84lc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.357989 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/337754ee-e2dc-4a26-84f4-6010c0f73133-util" (OuterVolumeSpecName: "util") pod "337754ee-e2dc-4a26-84f4-6010c0f73133" (UID: "337754ee-e2dc-4a26-84f4-6010c0f73133"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.446140 4691 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/337754ee-e2dc-4a26-84f4-6010c0f73133-util\") on node \"crc\" DevicePath \"\"" Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.446177 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r84lc\" (UniqueName: \"kubernetes.io/projected/337754ee-e2dc-4a26-84f4-6010c0f73133-kube-api-access-r84lc\") on node \"crc\" DevicePath \"\"" Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.446190 4691 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/337754ee-e2dc-4a26-84f4-6010c0f73133-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.966297 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" event={"ID":"337754ee-e2dc-4a26-84f4-6010c0f73133","Type":"ContainerDied","Data":"218b89e5ae6a8a02c883df34cd67c77c7d8f91e769ad9d56ac55eb69673c2b30"} Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.966692 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="218b89e5ae6a8a02c883df34cd67c77c7d8f91e769ad9d56ac55eb69673c2b30" Dec 02 07:59:39 crc kubenswrapper[4691]: I1202 07:59:39.966394 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h" Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.132236 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5"] Dec 02 07:59:45 crc kubenswrapper[4691]: E1202 07:59:45.132548 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337754ee-e2dc-4a26-84f4-6010c0f73133" containerName="util" Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.132563 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="337754ee-e2dc-4a26-84f4-6010c0f73133" containerName="util" Dec 02 07:59:45 crc kubenswrapper[4691]: E1202 07:59:45.132580 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337754ee-e2dc-4a26-84f4-6010c0f73133" containerName="extract" Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.132586 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="337754ee-e2dc-4a26-84f4-6010c0f73133" containerName="extract" Dec 02 07:59:45 crc kubenswrapper[4691]: E1202 07:59:45.132600 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337754ee-e2dc-4a26-84f4-6010c0f73133" containerName="pull" Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.132606 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="337754ee-e2dc-4a26-84f4-6010c0f73133" containerName="pull" Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.132728 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="337754ee-e2dc-4a26-84f4-6010c0f73133" containerName="extract" Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.133253 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5" Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.136660 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-mtmvd" Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.164087 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5"] Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.231712 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bjmp\" (UniqueName: \"kubernetes.io/projected/e36ff06e-9c96-433a-88fc-14c6941566ee-kube-api-access-7bjmp\") pod \"openstack-operator-controller-operator-7f9cf644cb-bqpg5\" (UID: \"e36ff06e-9c96-433a-88fc-14c6941566ee\") " pod="openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5" Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.333398 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bjmp\" (UniqueName: \"kubernetes.io/projected/e36ff06e-9c96-433a-88fc-14c6941566ee-kube-api-access-7bjmp\") pod \"openstack-operator-controller-operator-7f9cf644cb-bqpg5\" (UID: \"e36ff06e-9c96-433a-88fc-14c6941566ee\") " pod="openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5" Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.357436 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bjmp\" (UniqueName: \"kubernetes.io/projected/e36ff06e-9c96-433a-88fc-14c6941566ee-kube-api-access-7bjmp\") pod \"openstack-operator-controller-operator-7f9cf644cb-bqpg5\" (UID: \"e36ff06e-9c96-433a-88fc-14c6941566ee\") " pod="openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5" Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.454095 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5" Dec 02 07:59:45 crc kubenswrapper[4691]: I1202 07:59:45.934839 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5"] Dec 02 07:59:45 crc kubenswrapper[4691]: W1202 07:59:45.939918 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode36ff06e_9c96_433a_88fc_14c6941566ee.slice/crio-d653bcfd4b36ad684fe8331d1de07377f16e5e23fa25fac2aea1485134bafa72 WatchSource:0}: Error finding container d653bcfd4b36ad684fe8331d1de07377f16e5e23fa25fac2aea1485134bafa72: Status 404 returned error can't find the container with id d653bcfd4b36ad684fe8331d1de07377f16e5e23fa25fac2aea1485134bafa72 Dec 02 07:59:46 crc kubenswrapper[4691]: I1202 07:59:46.007453 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5" event={"ID":"e36ff06e-9c96-433a-88fc-14c6941566ee","Type":"ContainerStarted","Data":"d653bcfd4b36ad684fe8331d1de07377f16e5e23fa25fac2aea1485134bafa72"} Dec 02 07:59:51 crc kubenswrapper[4691]: I1202 07:59:51.898178 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 07:59:51 crc kubenswrapper[4691]: I1202 07:59:51.898803 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 07:59:51 crc kubenswrapper[4691]: I1202 07:59:51.898993 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 07:59:51 crc kubenswrapper[4691]: I1202 07:59:51.899671 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"49c546328dbd8547e0ff1dcfee99f503a31e4448db0773f3ebd91ead3aa35f8b"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 07:59:51 crc kubenswrapper[4691]: I1202 07:59:51.899786 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://49c546328dbd8547e0ff1dcfee99f503a31e4448db0773f3ebd91ead3aa35f8b" gracePeriod=600 Dec 02 07:59:52 crc kubenswrapper[4691]: I1202 07:59:52.106145 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="49c546328dbd8547e0ff1dcfee99f503a31e4448db0773f3ebd91ead3aa35f8b" exitCode=0 Dec 02 07:59:52 crc kubenswrapper[4691]: I1202 07:59:52.106181 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" 
event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"49c546328dbd8547e0ff1dcfee99f503a31e4448db0773f3ebd91ead3aa35f8b"} Dec 02 07:59:52 crc kubenswrapper[4691]: I1202 07:59:52.106228 4691 scope.go:117] "RemoveContainer" containerID="a16996ecf2c98a339a05624da8a98affe6240ab365ac144c76fc73906ae11b70" Dec 02 07:59:53 crc kubenswrapper[4691]: I1202 07:59:53.115015 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5" event={"ID":"e36ff06e-9c96-433a-88fc-14c6941566ee","Type":"ContainerStarted","Data":"00cd102ee3e53e8c78b8718bae05dd66d7ee29526350af1ce1b445320a3f8ba6"} Dec 02 07:59:53 crc kubenswrapper[4691]: I1202 07:59:53.115569 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5" Dec 02 07:59:53 crc kubenswrapper[4691]: I1202 07:59:53.119407 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"29b37d8d63090a5b29435fd6f341a26e6433431bf7160b686913291b1dd9efc2"} Dec 02 07:59:53 crc kubenswrapper[4691]: I1202 07:59:53.178665 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5" podStartSLOduration=1.409973471 podStartE2EDuration="8.178646588s" podCreationTimestamp="2025-12-02 07:59:45 +0000 UTC" firstStartedPulling="2025-12-02 07:59:45.942585279 +0000 UTC m=+833.726664141" lastFinishedPulling="2025-12-02 07:59:52.711258396 +0000 UTC m=+840.495337258" observedRunningTime="2025-12-02 07:59:53.154827864 +0000 UTC m=+840.938906726" watchObservedRunningTime="2025-12-02 07:59:53.178646588 +0000 UTC m=+840.962725450" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.151995 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8"] Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.153553 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.156372 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.157054 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.165573 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8"] Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.210828 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcvmg\" (UniqueName: \"kubernetes.io/projected/c14266eb-cdfd-4665-a26b-f545bcbdff7d-kube-api-access-fcvmg\") pod \"collect-profiles-29411040-tdpp8\" (UID: \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.210916 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c14266eb-cdfd-4665-a26b-f545bcbdff7d-secret-volume\") pod \"collect-profiles-29411040-tdpp8\" (UID: \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.211090 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c14266eb-cdfd-4665-a26b-f545bcbdff7d-config-volume\") pod \"collect-profiles-29411040-tdpp8\" (UID: \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.312702 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c14266eb-cdfd-4665-a26b-f545bcbdff7d-secret-volume\") pod \"collect-profiles-29411040-tdpp8\" (UID: \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.313350 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c14266eb-cdfd-4665-a26b-f545bcbdff7d-config-volume\") pod \"collect-profiles-29411040-tdpp8\" (UID: \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.313522 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcvmg\" (UniqueName: \"kubernetes.io/projected/c14266eb-cdfd-4665-a26b-f545bcbdff7d-kube-api-access-fcvmg\") pod \"collect-profiles-29411040-tdpp8\" (UID: \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.314609 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c14266eb-cdfd-4665-a26b-f545bcbdff7d-config-volume\") pod 
\"collect-profiles-29411040-tdpp8\" (UID: \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.322705 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c14266eb-cdfd-4665-a26b-f545bcbdff7d-secret-volume\") pod \"collect-profiles-29411040-tdpp8\" (UID: \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.334444 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcvmg\" (UniqueName: \"kubernetes.io/projected/c14266eb-cdfd-4665-a26b-f545bcbdff7d-kube-api-access-fcvmg\") pod \"collect-profiles-29411040-tdpp8\" (UID: \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:00 crc kubenswrapper[4691]: I1202 08:00:00.472587 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:01 crc kubenswrapper[4691]: I1202 08:00:01.035738 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8"] Dec 02 08:00:01 crc kubenswrapper[4691]: I1202 08:00:01.171022 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" event={"ID":"c14266eb-cdfd-4665-a26b-f545bcbdff7d","Type":"ContainerStarted","Data":"26dfcd7450424e71af37397c7e085a87ee464bea0d9be167e6b936eb0eef7827"} Dec 02 08:00:02 crc kubenswrapper[4691]: I1202 08:00:02.178282 4691 generic.go:334] "Generic (PLEG): container finished" podID="c14266eb-cdfd-4665-a26b-f545bcbdff7d" containerID="cf02052188f0ab04cd72818b1ca0b7f542f6428abd5b10ddff25c96810ded2ea" exitCode=0 Dec 02 08:00:02 crc kubenswrapper[4691]: I1202 08:00:02.178331 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" event={"ID":"c14266eb-cdfd-4665-a26b-f545bcbdff7d","Type":"ContainerDied","Data":"cf02052188f0ab04cd72818b1ca0b7f542f6428abd5b10ddff25c96810ded2ea"} Dec 02 08:00:03 crc kubenswrapper[4691]: I1202 08:00:03.940125 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:03 crc kubenswrapper[4691]: I1202 08:00:03.974419 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcvmg\" (UniqueName: \"kubernetes.io/projected/c14266eb-cdfd-4665-a26b-f545bcbdff7d-kube-api-access-fcvmg\") pod \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\" (UID: \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\") " Dec 02 08:00:03 crc kubenswrapper[4691]: I1202 08:00:03.974505 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c14266eb-cdfd-4665-a26b-f545bcbdff7d-config-volume\") pod \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\" (UID: \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\") " Dec 02 08:00:03 crc kubenswrapper[4691]: I1202 08:00:03.974584 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c14266eb-cdfd-4665-a26b-f545bcbdff7d-secret-volume\") pod \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\" (UID: \"c14266eb-cdfd-4665-a26b-f545bcbdff7d\") " Dec 02 08:00:03 crc kubenswrapper[4691]: I1202 08:00:03.975318 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c14266eb-cdfd-4665-a26b-f545bcbdff7d-config-volume" (OuterVolumeSpecName: "config-volume") pod "c14266eb-cdfd-4665-a26b-f545bcbdff7d" (UID: "c14266eb-cdfd-4665-a26b-f545bcbdff7d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:00:03 crc kubenswrapper[4691]: I1202 08:00:03.979437 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c14266eb-cdfd-4665-a26b-f545bcbdff7d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c14266eb-cdfd-4665-a26b-f545bcbdff7d" (UID: "c14266eb-cdfd-4665-a26b-f545bcbdff7d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:00:03 crc kubenswrapper[4691]: I1202 08:00:03.979430 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c14266eb-cdfd-4665-a26b-f545bcbdff7d-kube-api-access-fcvmg" (OuterVolumeSpecName: "kube-api-access-fcvmg") pod "c14266eb-cdfd-4665-a26b-f545bcbdff7d" (UID: "c14266eb-cdfd-4665-a26b-f545bcbdff7d"). InnerVolumeSpecName "kube-api-access-fcvmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:00:04 crc kubenswrapper[4691]: I1202 08:00:04.075740 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c14266eb-cdfd-4665-a26b-f545bcbdff7d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:00:04 crc kubenswrapper[4691]: I1202 08:00:04.075808 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c14266eb-cdfd-4665-a26b-f545bcbdff7d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:00:04 crc kubenswrapper[4691]: I1202 08:00:04.075820 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcvmg\" (UniqueName: \"kubernetes.io/projected/c14266eb-cdfd-4665-a26b-f545bcbdff7d-kube-api-access-fcvmg\") on node \"crc\" DevicePath \"\"" Dec 02 08:00:04 crc kubenswrapper[4691]: I1202 08:00:04.203248 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" event={"ID":"c14266eb-cdfd-4665-a26b-f545bcbdff7d","Type":"ContainerDied","Data":"26dfcd7450424e71af37397c7e085a87ee464bea0d9be167e6b936eb0eef7827"} Dec 02 08:00:04 crc kubenswrapper[4691]: I1202 08:00:04.203302 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26dfcd7450424e71af37397c7e085a87ee464bea0d9be167e6b936eb0eef7827" Dec 02 08:00:04 crc kubenswrapper[4691]: I1202 08:00:04.203325 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8" Dec 02 08:00:05 crc kubenswrapper[4691]: I1202 08:00:05.457246 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-7f9cf644cb-bqpg5" Dec 02 08:00:07 crc kubenswrapper[4691]: I1202 08:00:07.913342 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n76c6"] Dec 02 08:00:07 crc kubenswrapper[4691]: E1202 08:00:07.914665 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c14266eb-cdfd-4665-a26b-f545bcbdff7d" containerName="collect-profiles" Dec 02 08:00:07 crc kubenswrapper[4691]: I1202 08:00:07.914809 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c14266eb-cdfd-4665-a26b-f545bcbdff7d" containerName="collect-profiles" Dec 02 08:00:07 crc kubenswrapper[4691]: I1202 08:00:07.915046 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c14266eb-cdfd-4665-a26b-f545bcbdff7d" containerName="collect-profiles" Dec 02 08:00:07 crc kubenswrapper[4691]: I1202 08:00:07.916200 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:07 crc kubenswrapper[4691]: I1202 08:00:07.924582 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n76c6"] Dec 02 08:00:08 crc kubenswrapper[4691]: I1202 08:00:08.027752 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11cbf648-78c5-42d5-8549-b44d6f6f9a46-catalog-content\") pod \"redhat-marketplace-n76c6\" (UID: \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\") " pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:08 crc kubenswrapper[4691]: I1202 08:00:08.027855 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5l84\" (UniqueName: \"kubernetes.io/projected/11cbf648-78c5-42d5-8549-b44d6f6f9a46-kube-api-access-s5l84\") pod \"redhat-marketplace-n76c6\" (UID: \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\") " pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:08 crc kubenswrapper[4691]: I1202 08:00:08.028291 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11cbf648-78c5-42d5-8549-b44d6f6f9a46-utilities\") pod \"redhat-marketplace-n76c6\" (UID: \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\") " pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:08 crc kubenswrapper[4691]: I1202 08:00:08.129999 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11cbf648-78c5-42d5-8549-b44d6f6f9a46-catalog-content\") pod \"redhat-marketplace-n76c6\" (UID: \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\") " pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:08 crc kubenswrapper[4691]: I1202 08:00:08.130311 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5l84\" (UniqueName: \"kubernetes.io/projected/11cbf648-78c5-42d5-8549-b44d6f6f9a46-kube-api-access-s5l84\") pod \"redhat-marketplace-n76c6\" (UID: \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\") " pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:08 crc kubenswrapper[4691]: I1202 08:00:08.130465 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11cbf648-78c5-42d5-8549-b44d6f6f9a46-utilities\") pod \"redhat-marketplace-n76c6\" (UID: \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\") " pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:08 crc kubenswrapper[4691]: I1202 08:00:08.130653 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11cbf648-78c5-42d5-8549-b44d6f6f9a46-catalog-content\") pod \"redhat-marketplace-n76c6\" (UID: \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\") " pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:08 crc kubenswrapper[4691]: I1202 08:00:08.130988 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11cbf648-78c5-42d5-8549-b44d6f6f9a46-utilities\") pod \"redhat-marketplace-n76c6\" (UID: \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\") " pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:08 crc kubenswrapper[4691]: I1202 08:00:08.151316 4691 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-s5l84\" (UniqueName: \"kubernetes.io/projected/11cbf648-78c5-42d5-8549-b44d6f6f9a46-kube-api-access-s5l84\") pod \"redhat-marketplace-n76c6\" (UID: \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\") " pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:08 crc kubenswrapper[4691]: I1202 08:00:08.232480 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:08 crc kubenswrapper[4691]: I1202 08:00:08.519364 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n76c6"] Dec 02 08:00:08 crc kubenswrapper[4691]: W1202 08:00:08.526070 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11cbf648_78c5_42d5_8549_b44d6f6f9a46.slice/crio-1e52de77d2b0d04003bdab544293b892ace1e2184cdee8c3424ebd13033caaad WatchSource:0}: Error finding container 1e52de77d2b0d04003bdab544293b892ace1e2184cdee8c3424ebd13033caaad: Status 404 returned error can't find the container with id 1e52de77d2b0d04003bdab544293b892ace1e2184cdee8c3424ebd13033caaad Dec 02 08:00:09 crc kubenswrapper[4691]: I1202 08:00:09.255035 4691 generic.go:334] "Generic (PLEG): container finished" podID="11cbf648-78c5-42d5-8549-b44d6f6f9a46" containerID="455db217323e23dd85bd12a74546acc77ddc1ec8bcdfa90d5f362a644a782646" exitCode=0 Dec 02 08:00:09 crc kubenswrapper[4691]: I1202 08:00:09.255217 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n76c6" event={"ID":"11cbf648-78c5-42d5-8549-b44d6f6f9a46","Type":"ContainerDied","Data":"455db217323e23dd85bd12a74546acc77ddc1ec8bcdfa90d5f362a644a782646"} Dec 02 08:00:09 crc kubenswrapper[4691]: I1202 08:00:09.255336 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n76c6" event={"ID":"11cbf648-78c5-42d5-8549-b44d6f6f9a46","Type":"ContainerStarted","Data":"1e52de77d2b0d04003bdab544293b892ace1e2184cdee8c3424ebd13033caaad"} Dec 02 08:00:11 crc kubenswrapper[4691]: I1202 08:00:11.268381 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n76c6" event={"ID":"11cbf648-78c5-42d5-8549-b44d6f6f9a46","Type":"ContainerStarted","Data":"a169134bab1e7f720516d7f617051a7a6ca3e95905db9a1d89822b453c674cac"} Dec 02 08:00:12 crc kubenswrapper[4691]: I1202 08:00:12.276091 4691 generic.go:334] "Generic (PLEG): container finished" podID="11cbf648-78c5-42d5-8549-b44d6f6f9a46" containerID="a169134bab1e7f720516d7f617051a7a6ca3e95905db9a1d89822b453c674cac" exitCode=0 Dec 02 08:00:12 crc kubenswrapper[4691]: I1202 08:00:12.276801 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n76c6" event={"ID":"11cbf648-78c5-42d5-8549-b44d6f6f9a46","Type":"ContainerDied","Data":"a169134bab1e7f720516d7f617051a7a6ca3e95905db9a1d89822b453c674cac"} Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.350844 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mx8j9"] Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.352327 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.417804 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mx8j9"] Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.446598 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acde365e-9d1d-494f-837f-8365c0be9034-utilities\") pod \"community-operators-mx8j9\" (UID: \"acde365e-9d1d-494f-837f-8365c0be9034\") " pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.446659 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acde365e-9d1d-494f-837f-8365c0be9034-catalog-content\") pod \"community-operators-mx8j9\" (UID: \"acde365e-9d1d-494f-837f-8365c0be9034\") " pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.446726 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwvp\" (UniqueName: \"kubernetes.io/projected/acde365e-9d1d-494f-837f-8365c0be9034-kube-api-access-nxwvp\") pod \"community-operators-mx8j9\" (UID: \"acde365e-9d1d-494f-837f-8365c0be9034\") " pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.548114 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwvp\" (UniqueName: \"kubernetes.io/projected/acde365e-9d1d-494f-837f-8365c0be9034-kube-api-access-nxwvp\") pod \"community-operators-mx8j9\" (UID: \"acde365e-9d1d-494f-837f-8365c0be9034\") " pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.548229 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acde365e-9d1d-494f-837f-8365c0be9034-utilities\") pod \"community-operators-mx8j9\" (UID: \"acde365e-9d1d-494f-837f-8365c0be9034\") " pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.548280 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acde365e-9d1d-494f-837f-8365c0be9034-catalog-content\") pod \"community-operators-mx8j9\" (UID: \"acde365e-9d1d-494f-837f-8365c0be9034\") " pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.548862 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acde365e-9d1d-494f-837f-8365c0be9034-catalog-content\") pod \"community-operators-mx8j9\" (UID: \"acde365e-9d1d-494f-837f-8365c0be9034\") " pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.549491 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acde365e-9d1d-494f-837f-8365c0be9034-utilities\") pod \"community-operators-mx8j9\" (UID: \"acde365e-9d1d-494f-837f-8365c0be9034\") " pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.588206 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nxwvp\" (UniqueName: \"kubernetes.io/projected/acde365e-9d1d-494f-837f-8365c0be9034-kube-api-access-nxwvp\") pod \"community-operators-mx8j9\" (UID: \"acde365e-9d1d-494f-837f-8365c0be9034\") " pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:14 crc kubenswrapper[4691]: I1202 08:00:14.693225 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:15 crc kubenswrapper[4691]: I1202 08:00:15.636387 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mx8j9"] Dec 02 08:00:16 crc kubenswrapper[4691]: I1202 08:00:16.353667 4691 generic.go:334] "Generic (PLEG): container finished" podID="acde365e-9d1d-494f-837f-8365c0be9034" containerID="84281e39ae271ce38e578df2791dafe530e92e9d45f37dff445f891419c28ba2" exitCode=0 Dec 02 08:00:16 crc kubenswrapper[4691]: I1202 08:00:16.353792 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx8j9" event={"ID":"acde365e-9d1d-494f-837f-8365c0be9034","Type":"ContainerDied","Data":"84281e39ae271ce38e578df2791dafe530e92e9d45f37dff445f891419c28ba2"} Dec 02 08:00:16 crc kubenswrapper[4691]: I1202 08:00:16.354431 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx8j9" event={"ID":"acde365e-9d1d-494f-837f-8365c0be9034","Type":"ContainerStarted","Data":"3e81c5caaf1a11bcd64644823751ecfbd57d7a9340acf18e819ccee7d909119b"} Dec 02 08:00:16 crc kubenswrapper[4691]: I1202 08:00:16.357210 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n76c6" event={"ID":"11cbf648-78c5-42d5-8549-b44d6f6f9a46","Type":"ContainerStarted","Data":"986fbf01ee2bf5afc3ea02367fe50d88b5c2624ac8a1a3ef9eda4fccd29a6f0c"} Dec 02 08:00:16 crc kubenswrapper[4691]: I1202 08:00:16.401459 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n76c6" podStartSLOduration=4.585505784 podStartE2EDuration="9.40144142s" podCreationTimestamp="2025-12-02 08:00:07 +0000 UTC" firstStartedPulling="2025-12-02 08:00:09.257030045 +0000 UTC m=+857.041108907" lastFinishedPulling="2025-12-02 08:00:14.072965681 +0000 UTC m=+861.857044543" observedRunningTime="2025-12-02 08:00:16.398010613 +0000 UTC m=+864.182089475" watchObservedRunningTime="2025-12-02 08:00:16.40144142 +0000 UTC m=+864.185520282" Dec 02 08:00:18 crc kubenswrapper[4691]: I1202 08:00:18.232968 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:18 crc kubenswrapper[4691]: I1202 08:00:18.233266 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:18 crc kubenswrapper[4691]: I1202 08:00:18.570673 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:19 crc kubenswrapper[4691]: I1202 08:00:19.410915 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx8j9" event={"ID":"acde365e-9d1d-494f-837f-8365c0be9034","Type":"ContainerStarted","Data":"ce6031b3742a89ac1a0469f4f3e4e0761e8a0760cf29a3c789651b48acfaed17"} Dec 02 08:00:20 crc kubenswrapper[4691]: I1202 08:00:20.420836 4691 generic.go:334] "Generic (PLEG): container finished" 
podID="acde365e-9d1d-494f-837f-8365c0be9034" containerID="ce6031b3742a89ac1a0469f4f3e4e0761e8a0760cf29a3c789651b48acfaed17" exitCode=0 Dec 02 08:00:20 crc kubenswrapper[4691]: I1202 08:00:20.420928 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx8j9" event={"ID":"acde365e-9d1d-494f-837f-8365c0be9034","Type":"ContainerDied","Data":"ce6031b3742a89ac1a0469f4f3e4e0761e8a0760cf29a3c789651b48acfaed17"} Dec 02 08:00:25 crc kubenswrapper[4691]: I1202 08:00:25.488613 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx8j9" event={"ID":"acde365e-9d1d-494f-837f-8365c0be9034","Type":"ContainerStarted","Data":"4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29"} Dec 02 08:00:25 crc kubenswrapper[4691]: I1202 08:00:25.520698 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mx8j9" podStartSLOduration=2.926416207 podStartE2EDuration="11.520675531s" podCreationTimestamp="2025-12-02 08:00:14 +0000 UTC" firstStartedPulling="2025-12-02 08:00:16.355113255 +0000 UTC m=+864.139192107" lastFinishedPulling="2025-12-02 08:00:24.949372569 +0000 UTC m=+872.733451431" observedRunningTime="2025-12-02 08:00:25.517726497 +0000 UTC m=+873.301805359" watchObservedRunningTime="2025-12-02 08:00:25.520675531 +0000 UTC m=+873.304754393" Dec 02 08:00:28 crc kubenswrapper[4691]: I1202 08:00:28.309857 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:28 crc kubenswrapper[4691]: I1202 08:00:28.801679 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n76c6"] Dec 02 08:00:28 crc kubenswrapper[4691]: I1202 08:00:28.801917 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n76c6" podUID="11cbf648-78c5-42d5-8549-b44d6f6f9a46" containerName="registry-server" containerID="cri-o://986fbf01ee2bf5afc3ea02367fe50d88b5c2624ac8a1a3ef9eda4fccd29a6f0c" gracePeriod=2 Dec 02 08:00:29 crc kubenswrapper[4691]: I1202 08:00:29.583105 4691 generic.go:334] "Generic (PLEG): container finished" podID="11cbf648-78c5-42d5-8549-b44d6f6f9a46" containerID="986fbf01ee2bf5afc3ea02367fe50d88b5c2624ac8a1a3ef9eda4fccd29a6f0c" exitCode=0 Dec 02 08:00:29 crc kubenswrapper[4691]: I1202 08:00:29.583159 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n76c6" event={"ID":"11cbf648-78c5-42d5-8549-b44d6f6f9a46","Type":"ContainerDied","Data":"986fbf01ee2bf5afc3ea02367fe50d88b5c2624ac8a1a3ef9eda4fccd29a6f0c"} Dec 02 08:00:29 crc kubenswrapper[4691]: I1202 08:00:29.908583 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.079440 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11cbf648-78c5-42d5-8549-b44d6f6f9a46-utilities\") pod \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\" (UID: \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\") " Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.079775 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11cbf648-78c5-42d5-8549-b44d6f6f9a46-catalog-content\") pod \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\" (UID: \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\") " Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.079937 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5l84\" (UniqueName: \"kubernetes.io/projected/11cbf648-78c5-42d5-8549-b44d6f6f9a46-kube-api-access-s5l84\") pod \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\" (UID: \"11cbf648-78c5-42d5-8549-b44d6f6f9a46\") " Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.081178 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11cbf648-78c5-42d5-8549-b44d6f6f9a46-utilities" (OuterVolumeSpecName: "utilities") pod "11cbf648-78c5-42d5-8549-b44d6f6f9a46" (UID: "11cbf648-78c5-42d5-8549-b44d6f6f9a46"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.099659 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11cbf648-78c5-42d5-8549-b44d6f6f9a46-kube-api-access-s5l84" (OuterVolumeSpecName: "kube-api-access-s5l84") pod "11cbf648-78c5-42d5-8549-b44d6f6f9a46" (UID: "11cbf648-78c5-42d5-8549-b44d6f6f9a46"). InnerVolumeSpecName "kube-api-access-s5l84". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.113539 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11cbf648-78c5-42d5-8549-b44d6f6f9a46-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11cbf648-78c5-42d5-8549-b44d6f6f9a46" (UID: "11cbf648-78c5-42d5-8549-b44d6f6f9a46"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.181860 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s5l84\" (UniqueName: \"kubernetes.io/projected/11cbf648-78c5-42d5-8549-b44d6f6f9a46-kube-api-access-s5l84\") on node \"crc\" DevicePath \"\"" Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.181892 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11cbf648-78c5-42d5-8549-b44d6f6f9a46-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.181902 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11cbf648-78c5-42d5-8549-b44d6f6f9a46-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.591736 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n76c6" event={"ID":"11cbf648-78c5-42d5-8549-b44d6f6f9a46","Type":"ContainerDied","Data":"1e52de77d2b0d04003bdab544293b892ace1e2184cdee8c3424ebd13033caaad"} Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.591958 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n76c6" Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.592648 4691 scope.go:117] "RemoveContainer" containerID="986fbf01ee2bf5afc3ea02367fe50d88b5c2624ac8a1a3ef9eda4fccd29a6f0c" Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.614138 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n76c6"] Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.617359 4691 scope.go:117] "RemoveContainer" containerID="a169134bab1e7f720516d7f617051a7a6ca3e95905db9a1d89822b453c674cac" Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.622050 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n76c6"] Dec 02 08:00:30 crc kubenswrapper[4691]: I1202 08:00:30.645907 4691 scope.go:117] "RemoveContainer" containerID="455db217323e23dd85bd12a74546acc77ddc1ec8bcdfa90d5f362a644a782646" Dec 02 08:00:32 crc kubenswrapper[4691]: I1202 08:00:32.576742 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11cbf648-78c5-42d5-8549-b44d6f6f9a46" path="/var/lib/kubelet/pods/11cbf648-78c5-42d5-8549-b44d6f6f9a46/volumes" Dec 02 08:00:34 crc kubenswrapper[4691]: I1202 08:00:34.694067 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:34 crc kubenswrapper[4691]: I1202 08:00:34.694363 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:34 crc kubenswrapper[4691]: I1202 08:00:34.740364 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.282974 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv"] Dec 02 08:00:35 crc kubenswrapper[4691]: E1202 08:00:35.283342 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cbf648-78c5-42d5-8549-b44d6f6f9a46" containerName="extract-content" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.283363 4691 
state_mem.go:107] "Deleted CPUSet assignment" podUID="11cbf648-78c5-42d5-8549-b44d6f6f9a46" containerName="extract-content" Dec 02 08:00:35 crc kubenswrapper[4691]: E1202 08:00:35.283386 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cbf648-78c5-42d5-8549-b44d6f6f9a46" containerName="extract-utilities" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.283395 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cbf648-78c5-42d5-8549-b44d6f6f9a46" containerName="extract-utilities" Dec 02 08:00:35 crc kubenswrapper[4691]: E1202 08:00:35.283414 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11cbf648-78c5-42d5-8549-b44d6f6f9a46" containerName="registry-server" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.283422 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="11cbf648-78c5-42d5-8549-b44d6f6f9a46" containerName="registry-server" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.283574 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="11cbf648-78c5-42d5-8549-b44d6f6f9a46" containerName="registry-server" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.284326 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.288089 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.289130 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.291436 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-khkc6" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.291748 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-lz7jj" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.299717 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.309890 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.314278 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.315305 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.319841 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-pw8xv" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.341250 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.362597 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.369817 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.374639 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-jjv2d" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.412172 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.451602 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87p6r\" (UniqueName: \"kubernetes.io/projected/e88d7782-bcf8-4d40-aa1c-269533471279-kube-api-access-87p6r\") pod \"cinder-operator-controller-manager-859b6ccc6-66zbg\" (UID: \"e88d7782-bcf8-4d40-aa1c-269533471279\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.451685 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l7r4\" (UniqueName: \"kubernetes.io/projected/5369081c-2142-4dfa-9482-b8d8d6d4195f-kube-api-access-2l7r4\") pod \"barbican-operator-controller-manager-7d9dfd778-w9hbv\" (UID: \"5369081c-2142-4dfa-9482-b8d8d6d4195f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.451805 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vw5r\" (UniqueName: \"kubernetes.io/projected/9e639d67-2200-474e-9be7-55bef7c97fe6-kube-api-access-9vw5r\") pod \"designate-operator-controller-manager-78b4bc895b-c7ftc\" (UID: \"9e639d67-2200-474e-9be7-55bef7c97fe6\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.451955 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.453288 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.457487 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-tp44t" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.487887 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.492327 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.493538 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.498246 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xt922" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.504642 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.519288 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.520371 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.522930 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-glkhb" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.523225 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.530713 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.548289 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.549553 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.555689 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-w6hjb" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.556134 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.556446 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25rkj\" (UniqueName: \"kubernetes.io/projected/f2666d2b-c30c-40d4-bfab-0e6d00571ecc-kube-api-access-25rkj\") pod \"glance-operator-controller-manager-77987cd8cd-n5f6s\" (UID: \"f2666d2b-c30c-40d4-bfab-0e6d00571ecc\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.556514 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87p6r\" (UniqueName: \"kubernetes.io/projected/e88d7782-bcf8-4d40-aa1c-269533471279-kube-api-access-87p6r\") pod \"cinder-operator-controller-manager-859b6ccc6-66zbg\" (UID: \"e88d7782-bcf8-4d40-aa1c-269533471279\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.556579 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l7r4\" (UniqueName: \"kubernetes.io/projected/5369081c-2142-4dfa-9482-b8d8d6d4195f-kube-api-access-2l7r4\") pod \"barbican-operator-controller-manager-7d9dfd778-w9hbv\" (UID: \"5369081c-2142-4dfa-9482-b8d8d6d4195f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.556626 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gggzn\" (UniqueName: \"kubernetes.io/projected/0361226c-435b-4221-b59d-74900b2552e1-kube-api-access-gggzn\") pod \"heat-operator-controller-manager-5f64f6f8bb-qltbw\" (UID: \"0361226c-435b-4221-b59d-74900b2552e1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.556721 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vw5r\" (UniqueName: \"kubernetes.io/projected/9e639d67-2200-474e-9be7-55bef7c97fe6-kube-api-access-9vw5r\") pod \"designate-operator-controller-manager-78b4bc895b-c7ftc\" (UID: \"9e639d67-2200-474e-9be7-55bef7c97fe6\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.557381 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.564259 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-wbwjt" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.564868 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.583602 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.584691 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.590277 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.590817 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vbwpb" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.595194 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vw5r\" (UniqueName: \"kubernetes.io/projected/9e639d67-2200-474e-9be7-55bef7c97fe6-kube-api-access-9vw5r\") pod \"designate-operator-controller-manager-78b4bc895b-c7ftc\" (UID: \"9e639d67-2200-474e-9be7-55bef7c97fe6\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.595251 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.596273 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.597016 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l7r4\" (UniqueName: \"kubernetes.io/projected/5369081c-2142-4dfa-9482-b8d8d6d4195f-kube-api-access-2l7r4\") pod \"barbican-operator-controller-manager-7d9dfd778-w9hbv\" (UID: \"5369081c-2142-4dfa-9482-b8d8d6d4195f\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.600797 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-hqkpq" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.603874 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.607087 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.609702 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87p6r\" (UniqueName: \"kubernetes.io/projected/e88d7782-bcf8-4d40-aa1c-269533471279-kube-api-access-87p6r\") pod \"cinder-operator-controller-manager-859b6ccc6-66zbg\" (UID: \"e88d7782-bcf8-4d40-aa1c-269533471279\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.610850 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.614900 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.616692 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.617384 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-pm4kc" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.639614 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.655262 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.658975 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.660296 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.661326 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8rvz\" (UniqueName: \"kubernetes.io/projected/b930ff47-307d-47b3-9b84-54e5860ee2db-kube-api-access-n8rvz\") pod \"infra-operator-controller-manager-57548d458d-n5t7p\" (UID: \"b930ff47-307d-47b3-9b84-54e5860ee2db\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.661471 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-887qx\" (UniqueName: \"kubernetes.io/projected/759a905a-dc61-4206-862f-cb8b6f85882f-kube-api-access-887qx\") pod \"keystone-operator-controller-manager-7765d96ddf-44twc\" (UID: \"759a905a-dc61-4206-862f-cb8b6f85882f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.661589 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25rkj\" (UniqueName: \"kubernetes.io/projected/f2666d2b-c30c-40d4-bfab-0e6d00571ecc-kube-api-access-25rkj\") pod \"glance-operator-controller-manager-77987cd8cd-n5f6s\" (UID: \"f2666d2b-c30c-40d4-bfab-0e6d00571ecc\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.661724 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q48mz\" (UniqueName: \"kubernetes.io/projected/c92e2d03-8848-432f-82f4-fd28b3b0fa34-kube-api-access-q48mz\") pod \"horizon-operator-controller-manager-68c6d99b8f-d898r\" (UID: \"c92e2d03-8848-432f-82f4-fd28b3b0fa34\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.661856 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gggzn\" (UniqueName: \"kubernetes.io/projected/0361226c-435b-4221-b59d-74900b2552e1-kube-api-access-gggzn\") pod \"heat-operator-controller-manager-5f64f6f8bb-qltbw\" (UID: \"0361226c-435b-4221-b59d-74900b2552e1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.661963 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxxtx\" (UniqueName: \"kubernetes.io/projected/a1687ef3-9bf5-451e-aa8a-22ede53d9ed9-kube-api-access-vxxtx\") pod \"ironic-operator-controller-manager-6c548fd776-p2dmm\" (UID: \"a1687ef3-9bf5-451e-aa8a-22ede53d9ed9\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.662051 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert\") pod \"infra-operator-controller-manager-57548d458d-n5t7p\" (UID: \"b930ff47-307d-47b3-9b84-54e5860ee2db\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.663068 4691 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-nvqc7" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.664334 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.685897 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.686699 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25rkj\" (UniqueName: \"kubernetes.io/projected/f2666d2b-c30c-40d4-bfab-0e6d00571ecc-kube-api-access-25rkj\") pod \"glance-operator-controller-manager-77987cd8cd-n5f6s\" (UID: \"f2666d2b-c30c-40d4-bfab-0e6d00571ecc\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.686847 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gggzn\" (UniqueName: \"kubernetes.io/projected/0361226c-435b-4221-b59d-74900b2552e1-kube-api-access-gggzn\") pod \"heat-operator-controller-manager-5f64f6f8bb-qltbw\" (UID: \"0361226c-435b-4221-b59d-74900b2552e1\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.698212 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-9p97b"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.699396 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.704931 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-rdktn" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.709674 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-9p97b"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.718056 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.719788 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.725381 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-5fqb9" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.726318 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.728431 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.733910 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.733991 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.734084 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.740497 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-l8rtf" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.740780 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.765512 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-962rp\" (UniqueName: \"kubernetes.io/projected/0667dbf1-e305-4d12-af7b-3d532a834609-kube-api-access-962rp\") pod \"mariadb-operator-controller-manager-56bbcc9d85-lhkdn\" (UID: \"0667dbf1-e305-4d12-af7b-3d532a834609\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.765580 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q48mz\" (UniqueName: \"kubernetes.io/projected/c92e2d03-8848-432f-82f4-fd28b3b0fa34-kube-api-access-q48mz\") pod \"horizon-operator-controller-manager-68c6d99b8f-d898r\" (UID: \"c92e2d03-8848-432f-82f4-fd28b3b0fa34\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.765633 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxxtx\" (UniqueName: \"kubernetes.io/projected/a1687ef3-9bf5-451e-aa8a-22ede53d9ed9-kube-api-access-vxxtx\") pod \"ironic-operator-controller-manager-6c548fd776-p2dmm\" (UID: \"a1687ef3-9bf5-451e-aa8a-22ede53d9ed9\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.765671 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert\") pod \"infra-operator-controller-manager-57548d458d-n5t7p\" (UID: \"b930ff47-307d-47b3-9b84-54e5860ee2db\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.765722 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd9rs\" (UniqueName: \"kubernetes.io/projected/36d938bc-e3d6-4f21-8327-5f655a4ef54a-kube-api-access-vd9rs\") pod \"manila-operator-controller-manager-7c79b5df47-28bs2\" (UID: 
\"36d938bc-e3d6-4f21-8327-5f655a4ef54a\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.765777 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8rvz\" (UniqueName: \"kubernetes.io/projected/b930ff47-307d-47b3-9b84-54e5860ee2db-kube-api-access-n8rvz\") pod \"infra-operator-controller-manager-57548d458d-n5t7p\" (UID: \"b930ff47-307d-47b3-9b84-54e5860ee2db\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.765827 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4gw6\" (UniqueName: \"kubernetes.io/projected/d82caf6e-e4a7-4474-8cd4-6d3f554ce608-kube-api-access-l4gw6\") pod \"nova-operator-controller-manager-697bc559fc-2g56f\" (UID: \"d82caf6e-e4a7-4474-8cd4-6d3f554ce608\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.765860 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hc5l\" (UniqueName: \"kubernetes.io/projected/f4783498-99ba-42cc-9312-8e8c6b279e5a-kube-api-access-5hc5l\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-6tdpt\" (UID: \"f4783498-99ba-42cc-9312-8e8c6b279e5a\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.765883 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-887qx\" (UniqueName: \"kubernetes.io/projected/759a905a-dc61-4206-862f-cb8b6f85882f-kube-api-access-887qx\") pod \"keystone-operator-controller-manager-7765d96ddf-44twc\" (UID: \"759a905a-dc61-4206-862f-cb8b6f85882f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.769816 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v"] Dec 02 08:00:35 crc kubenswrapper[4691]: E1202 08:00:35.770296 4691 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 08:00:35 crc kubenswrapper[4691]: E1202 08:00:35.770359 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert podName:b930ff47-307d-47b3-9b84-54e5860ee2db nodeName:}" failed. No retries permitted until 2025-12-02 08:00:36.270338724 +0000 UTC m=+884.054417676 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert") pod "infra-operator-controller-manager-57548d458d-n5t7p" (UID: "b930ff47-307d-47b3-9b84-54e5860ee2db") : secret "infra-operator-webhook-server-cert" not found Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.809541 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.810427 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn"] Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.825180 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxxtx\" (UniqueName: \"kubernetes.io/projected/a1687ef3-9bf5-451e-aa8a-22ede53d9ed9-kube-api-access-vxxtx\") pod \"ironic-operator-controller-manager-6c548fd776-p2dmm\" (UID: \"a1687ef3-9bf5-451e-aa8a-22ede53d9ed9\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.830450 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q48mz\" (UniqueName: \"kubernetes.io/projected/c92e2d03-8848-432f-82f4-fd28b3b0fa34-kube-api-access-q48mz\") pod \"horizon-operator-controller-manager-68c6d99b8f-d898r\" (UID: \"c92e2d03-8848-432f-82f4-fd28b3b0fa34\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.835234 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-887qx\" (UniqueName: \"kubernetes.io/projected/759a905a-dc61-4206-862f-cb8b6f85882f-kube-api-access-887qx\") pod \"keystone-operator-controller-manager-7765d96ddf-44twc\" (UID: \"759a905a-dc61-4206-862f-cb8b6f85882f\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.844529 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.852325 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8rvz\" (UniqueName: \"kubernetes.io/projected/b930ff47-307d-47b3-9b84-54e5860ee2db-kube-api-access-n8rvz\") pod \"infra-operator-controller-manager-57548d458d-n5t7p\" (UID: \"b930ff47-307d-47b3-9b84-54e5860ee2db\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.893303 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-f5f59" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.894392 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.903376 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfgtk\" (UniqueName: \"kubernetes.io/projected/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-kube-api-access-vfgtk\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v\" (UID: \"e6bec2f4-8aea-472b-a0f9-591b744f9fe4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.903431 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4gw6\" (UniqueName: \"kubernetes.io/projected/d82caf6e-e4a7-4474-8cd4-6d3f554ce608-kube-api-access-l4gw6\") pod \"nova-operator-controller-manager-697bc559fc-2g56f\" (UID: \"d82caf6e-e4a7-4474-8cd4-6d3f554ce608\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.903460 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgn6f\" (UniqueName: \"kubernetes.io/projected/e8014abc-9e31-40d3-8e34-d595a8ef95b4-kube-api-access-rgn6f\") pod \"octavia-operator-controller-manager-998648c74-9p97b\" (UID: \"e8014abc-9e31-40d3-8e34-d595a8ef95b4\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.903480 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hc5l\" (UniqueName: \"kubernetes.io/projected/f4783498-99ba-42cc-9312-8e8c6b279e5a-kube-api-access-5hc5l\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-6tdpt\" (UID: \"f4783498-99ba-42cc-9312-8e8c6b279e5a\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.903513 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-962rp\" (UniqueName: \"kubernetes.io/projected/0667dbf1-e305-4d12-af7b-3d532a834609-kube-api-access-962rp\") pod \"mariadb-operator-controller-manager-56bbcc9d85-lhkdn\" (UID: \"0667dbf1-e305-4d12-af7b-3d532a834609\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.903537 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v\" (UID: \"e6bec2f4-8aea-472b-a0f9-591b744f9fe4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.905024 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbwwv\" (UniqueName: \"kubernetes.io/projected/09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b-kube-api-access-xbwwv\") pod \"ovn-operator-controller-manager-b6456fdb6-gbqw2\" (UID: \"09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.905642 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vd9rs\" (UniqueName: \"kubernetes.io/projected/36d938bc-e3d6-4f21-8327-5f655a4ef54a-kube-api-access-vd9rs\") pod \"manila-operator-controller-manager-7c79b5df47-28bs2\" (UID: \"36d938bc-e3d6-4f21-8327-5f655a4ef54a\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2" Dec 02 08:00:35 crc kubenswrapper[4691]: I1202 08:00:35.909886 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv"] Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:35.912872 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:35.915011 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-p674d" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:35.917331 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn"] Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:35.923574 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb"] Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:35.924942 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:35.930106 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-vf5q4" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:35.978951 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv"] Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:35.981549 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hc5l\" (UniqueName: \"kubernetes.io/projected/f4783498-99ba-42cc-9312-8e8c6b279e5a-kube-api-access-5hc5l\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-6tdpt\" (UID: \"f4783498-99ba-42cc-9312-8e8c6b279e5a\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:35.987903 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd9rs\" (UniqueName: \"kubernetes.io/projected/36d938bc-e3d6-4f21-8327-5f655a4ef54a-kube-api-access-vd9rs\") pod \"manila-operator-controller-manager-7c79b5df47-28bs2\" (UID: \"36d938bc-e3d6-4f21-8327-5f655a4ef54a\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:35.988112 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4gw6\" (UniqueName: \"kubernetes.io/projected/d82caf6e-e4a7-4474-8cd4-6d3f554ce608-kube-api-access-l4gw6\") pod \"nova-operator-controller-manager-697bc559fc-2g56f\" (UID: \"d82caf6e-e4a7-4474-8cd4-6d3f554ce608\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:35.989788 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb"] Dec 02 08:00:36 crc kubenswrapper[4691]: 
I1202 08:00:35.995686 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-962rp\" (UniqueName: \"kubernetes.io/projected/0667dbf1-e305-4d12-af7b-3d532a834609-kube-api-access-962rp\") pod \"mariadb-operator-controller-manager-56bbcc9d85-lhkdn\" (UID: \"0667dbf1-e305-4d12-af7b-3d532a834609\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.007665 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26qcg\" (UniqueName: \"kubernetes.io/projected/fd33241e-270e-4dbe-b024-e368b2050ece-kube-api-access-26qcg\") pod \"placement-operator-controller-manager-78f8948974-nlwnv\" (UID: \"fd33241e-270e-4dbe-b024-e368b2050ece\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.007729 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95tfj\" (UniqueName: \"kubernetes.io/projected/1a9a40d0-4970-4551-b68e-a9697f250e94-kube-api-access-95tfj\") pod \"telemetry-operator-controller-manager-76cc84c6bb-666pb\" (UID: \"1a9a40d0-4970-4551-b68e-a9697f250e94\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.007760 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbwwv\" (UniqueName: \"kubernetes.io/projected/09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b-kube-api-access-xbwwv\") pod \"ovn-operator-controller-manager-b6456fdb6-gbqw2\" (UID: \"09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.007912 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfgtk\" (UniqueName: \"kubernetes.io/projected/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-kube-api-access-vfgtk\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v\" (UID: \"e6bec2f4-8aea-472b-a0f9-591b744f9fe4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.007947 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgn6f\" (UniqueName: \"kubernetes.io/projected/e8014abc-9e31-40d3-8e34-d595a8ef95b4-kube-api-access-rgn6f\") pod \"octavia-operator-controller-manager-998648c74-9p97b\" (UID: \"e8014abc-9e31-40d3-8e34-d595a8ef95b4\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.007984 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v\" (UID: \"e6bec2f4-8aea-472b-a0f9-591b744f9fe4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.008013 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4dwr\" (UniqueName: \"kubernetes.io/projected/dfa342c0-60d5-4025-a881-30d706833e2b-kube-api-access-h4dwr\") pod 
\"swift-operator-controller-manager-5f8c65bbfc-slsfn\" (UID: \"dfa342c0-60d5-4025-a881-30d706833e2b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn" Dec 02 08:00:36 crc kubenswrapper[4691]: E1202 08:00:36.008206 4691 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 08:00:36 crc kubenswrapper[4691]: E1202 08:00:36.008273 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert podName:e6bec2f4-8aea-472b-a0f9-591b744f9fe4 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:36.508256195 +0000 UTC m=+884.292335057 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" (UID: "e6bec2f4-8aea-472b-a0f9-591b744f9fe4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.040110 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbwwv\" (UniqueName: \"kubernetes.io/projected/09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b-kube-api-access-xbwwv\") pod \"ovn-operator-controller-manager-b6456fdb6-gbqw2\" (UID: \"09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.040417 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.041137 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgn6f\" (UniqueName: \"kubernetes.io/projected/e8014abc-9e31-40d3-8e34-d595a8ef95b4-kube-api-access-rgn6f\") pod \"octavia-operator-controller-manager-998648c74-9p97b\" (UID: \"e8014abc-9e31-40d3-8e34-d595a8ef95b4\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.046354 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfgtk\" (UniqueName: \"kubernetes.io/projected/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-kube-api-access-vfgtk\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v\" (UID: \"e6bec2f4-8aea-472b-a0f9-591b744f9fe4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.046930 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.070675 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd"] Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.073194 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.085297 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.100404 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.111033 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95tfj\" (UniqueName: \"kubernetes.io/projected/1a9a40d0-4970-4551-b68e-a9697f250e94-kube-api-access-95tfj\") pod \"telemetry-operator-controller-manager-76cc84c6bb-666pb\" (UID: \"1a9a40d0-4970-4551-b68e-a9697f250e94\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.111945 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4dwr\" (UniqueName: \"kubernetes.io/projected/dfa342c0-60d5-4025-a881-30d706833e2b-kube-api-access-h4dwr\") pod \"swift-operator-controller-manager-5f8c65bbfc-slsfn\" (UID: \"dfa342c0-60d5-4025-a881-30d706833e2b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.112585 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26qcg\" (UniqueName: \"kubernetes.io/projected/fd33241e-270e-4dbe-b024-e368b2050ece-kube-api-access-26qcg\") pod \"placement-operator-controller-manager-78f8948974-nlwnv\" (UID: \"fd33241e-270e-4dbe-b024-e368b2050ece\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.113116 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.113268 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.115970 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.128096 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.139144 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd"] Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.141091 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-5nrrb" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.153619 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mx8j9"] Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.193795 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk"] Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.195563 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.198492 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk"] Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.214203 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkbnq\" (UniqueName: \"kubernetes.io/projected/4a080540-9871-4b8c-9d74-e3f1f3cf317c-kube-api-access-dkbnq\") pod \"test-operator-controller-manager-5854674fcc-wnzqd\" (UID: \"4a080540-9871-4b8c-9d74-e3f1f3cf317c\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.289983 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5krsn" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.323970 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert\") pod \"infra-operator-controller-manager-57548d458d-n5t7p\" (UID: \"b930ff47-307d-47b3-9b84-54e5860ee2db\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.324064 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkbnq\" (UniqueName: \"kubernetes.io/projected/4a080540-9871-4b8c-9d74-e3f1f3cf317c-kube-api-access-dkbnq\") pod \"test-operator-controller-manager-5854674fcc-wnzqd\" (UID: \"4a080540-9871-4b8c-9d74-e3f1f3cf317c\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.324117 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc8ds\" (UniqueName: \"kubernetes.io/projected/f75595fc-314d-4aa9-bc60-e82c16361768-kube-api-access-hc8ds\") pod \"watcher-operator-controller-manager-769dc69bc-4k8xk\" (UID: \"f75595fc-314d-4aa9-bc60-e82c16361768\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk" Dec 02 08:00:36 crc kubenswrapper[4691]: E1202 08:00:36.324188 4691 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 08:00:36 crc kubenswrapper[4691]: E1202 08:00:36.324290 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert podName:b930ff47-307d-47b3-9b84-54e5860ee2db nodeName:}" failed. No retries permitted until 2025-12-02 08:00:37.324253619 +0000 UTC m=+885.108332671 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert") pod "infra-operator-controller-manager-57548d458d-n5t7p" (UID: "b930ff47-307d-47b3-9b84-54e5860ee2db") : secret "infra-operator-webhook-server-cert" not found Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.325959 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4dwr\" (UniqueName: \"kubernetes.io/projected/dfa342c0-60d5-4025-a881-30d706833e2b-kube-api-access-h4dwr\") pod \"swift-operator-controller-manager-5f8c65bbfc-slsfn\" (UID: \"dfa342c0-60d5-4025-a881-30d706833e2b\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.326599 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26qcg\" (UniqueName: \"kubernetes.io/projected/fd33241e-270e-4dbe-b024-e368b2050ece-kube-api-access-26qcg\") pod \"placement-operator-controller-manager-78f8948974-nlwnv\" (UID: \"fd33241e-270e-4dbe-b024-e368b2050ece\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.348704 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95tfj\" (UniqueName: \"kubernetes.io/projected/1a9a40d0-4970-4551-b68e-a9697f250e94-kube-api-access-95tfj\") pod \"telemetry-operator-controller-manager-76cc84c6bb-666pb\" (UID: \"1a9a40d0-4970-4551-b68e-a9697f250e94\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.355115 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkbnq\" (UniqueName: \"kubernetes.io/projected/4a080540-9871-4b8c-9d74-e3f1f3cf317c-kube-api-access-dkbnq\") pod \"test-operator-controller-manager-5854674fcc-wnzqd\" (UID: \"4a080540-9871-4b8c-9d74-e3f1f3cf317c\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.382402 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql"] Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.383593 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.386453 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.386686 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-82m5v" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.391977 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.407412 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql"] Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.486642 4691 util.go:30] "No sandbox for pod can be found. 
Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.487276 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql"
Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.487342 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h86q9\" (UniqueName: \"kubernetes.io/projected/e837878c-4a1f-463b-913c-7df163c5ba27-kube-api-access-h86q9\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql"
Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.487380 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql"
Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.487420 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc8ds\" (UniqueName: \"kubernetes.io/projected/f75595fc-314d-4aa9-bc60-e82c16361768-kube-api-access-hc8ds\") pod \"watcher-operator-controller-manager-769dc69bc-4k8xk\" (UID: \"f75595fc-314d-4aa9-bc60-e82c16361768\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk"
Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.487790 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn"
Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.493939 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb"
Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.499815 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd"
Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.513310 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc8ds\" (UniqueName: \"kubernetes.io/projected/f75595fc-314d-4aa9-bc60-e82c16361768-kube-api-access-hc8ds\") pod \"watcher-operator-controller-manager-769dc69bc-4k8xk\" (UID: \"f75595fc-314d-4aa9-bc60-e82c16361768\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk"
Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.514862 4691 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.542154 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh"] Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.543224 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.711809 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.711886 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v\" (UID: \"e6bec2f4-8aea-472b-a0f9-591b744f9fe4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.711927 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.711962 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h86q9\" (UniqueName: \"kubernetes.io/projected/e837878c-4a1f-463b-913c-7df163c5ba27-kube-api-access-h86q9\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:36 crc kubenswrapper[4691]: E1202 08:00:36.712837 4691 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 08:00:36 crc kubenswrapper[4691]: E1202 08:00:36.712882 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs podName:e837878c-4a1f-463b-913c-7df163c5ba27 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:37.212867689 +0000 UTC m=+884.996946551 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs") pod "openstack-operator-controller-manager-85b7b84db-d9nql" (UID: "e837878c-4a1f-463b-913c-7df163c5ba27") : secret "webhook-server-cert" not found Dec 02 08:00:36 crc kubenswrapper[4691]: E1202 08:00:36.713036 4691 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 08:00:36 crc kubenswrapper[4691]: E1202 08:00:36.713060 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert podName:e6bec2f4-8aea-472b-a0f9-591b744f9fe4 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:37.713051544 +0000 UTC m=+885.497130406 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" (UID: "e6bec2f4-8aea-472b-a0f9-591b744f9fe4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 08:00:36 crc kubenswrapper[4691]: E1202 08:00:36.713096 4691 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 08:00:36 crc kubenswrapper[4691]: E1202 08:00:36.713113 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs podName:e837878c-4a1f-463b-913c-7df163c5ba27 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:37.213107485 +0000 UTC m=+884.997186347 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs") pod "openstack-operator-controller-manager-85b7b84db-d9nql" (UID: "e837878c-4a1f-463b-913c-7df163c5ba27") : secret "metrics-server-cert" not found Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.714077 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6h29w" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.888316 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h86q9\" (UniqueName: \"kubernetes.io/projected/e837878c-4a1f-463b-913c-7df163c5ba27-kube-api-access-h86q9\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.898447 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ntzn\" (UniqueName: \"kubernetes.io/projected/b42f21ed-f361-4b6f-abc7-03b237501f65-kube-api-access-9ntzn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2kbsh\" (UID: \"b42f21ed-f361-4b6f-abc7-03b237501f65\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" Dec 02 08:00:36 crc kubenswrapper[4691]: I1202 08:00:36.983426 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh"] Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.045086 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9ntzn\" (UniqueName: \"kubernetes.io/projected/b42f21ed-f361-4b6f-abc7-03b237501f65-kube-api-access-9ntzn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2kbsh\" (UID: \"b42f21ed-f361-4b6f-abc7-03b237501f65\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.070583 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ntzn\" (UniqueName: \"kubernetes.io/projected/b42f21ed-f361-4b6f-abc7-03b237501f65-kube-api-access-9ntzn\") pod \"rabbitmq-cluster-operator-manager-668c99d594-2kbsh\" (UID: \"b42f21ed-f361-4b6f-abc7-03b237501f65\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.169383 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv"] Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.175908 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.247723 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:37 crc kubenswrapper[4691]: E1202 08:00:37.248072 4691 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 08:00:37 crc kubenswrapper[4691]: E1202 08:00:37.248132 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs podName:e837878c-4a1f-463b-913c-7df163c5ba27 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:38.248112246 +0000 UTC m=+886.032191188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs") pod "openstack-operator-controller-manager-85b7b84db-d9nql" (UID: "e837878c-4a1f-463b-913c-7df163c5ba27") : secret "metrics-server-cert" not found Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.248157 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:37 crc kubenswrapper[4691]: E1202 08:00:37.248362 4691 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 08:00:37 crc kubenswrapper[4691]: E1202 08:00:37.248448 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs podName:e837878c-4a1f-463b-913c-7df163c5ba27 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:38.248427943 +0000 UTC m=+886.032506885 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs") pod "openstack-operator-controller-manager-85b7b84db-d9nql" (UID: "e837878c-4a1f-463b-913c-7df163c5ba27") : secret "webhook-server-cert" not found
Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.278031 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg"]
Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.350260 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert\") pod \"infra-operator-controller-manager-57548d458d-n5t7p\" (UID: \"b930ff47-307d-47b3-9b84-54e5860ee2db\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p"
Dec 02 08:00:37 crc kubenswrapper[4691]: E1202 08:00:37.350504 4691 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 02 08:00:37 crc kubenswrapper[4691]: E1202 08:00:37.350564 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert podName:b930ff47-307d-47b3-9b84-54e5860ee2db nodeName:}" failed. No retries permitted until 2025-12-02 08:00:39.350545081 +0000 UTC m=+887.134623943 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert") pod "infra-operator-controller-manager-57548d458d-n5t7p" (UID: "b930ff47-307d-47b3-9b84-54e5860ee2db") : secret "infra-operator-webhook-server-cert" not found
Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.757000 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v\" (UID: \"e6bec2f4-8aea-472b-a0f9-591b744f9fe4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v"
Dec 02 08:00:37 crc kubenswrapper[4691]: E1202 08:00:37.757504 4691 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 02 08:00:37 crc kubenswrapper[4691]: E1202 08:00:37.757739 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert podName:e6bec2f4-8aea-472b-a0f9-591b744f9fe4 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:39.757631635 +0000 UTC m=+887.541710507 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" (UID: "e6bec2f4-8aea-472b-a0f9-591b744f9fe4") : secret "openstack-baremetal-operator-webhook-server-cert" not found
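
Note the retry cadence across these entries: durationBeforeRetry for the same volume grows from 500ms to 1s to 2s, i.e. the kubelet backs off exponentially between mount attempts. A minimal sketch of that doubling schedule; the ceiling value is an assumption for illustration, since this log only shows the first three steps:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Doubling schedule as observed above: 500ms, 1s, 2s, ...
        backoff := 500 * time.Millisecond
        const maxBackoff = 2 * time.Minute // assumed ceiling for the sketch
        for attempt := 1; attempt <= 6; attempt++ {
            fmt.Printf("attempt %d failed; no retries permitted for %v\n", attempt, backoff)
            backoff *= 2
            if backoff > maxBackoff {
                backoff = maxBackoff
            }
        }
    }

The backoff keeps a crash-looping mount from hammering the API server while still converging quickly once the missing Secret shows up.
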
Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.770526 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s"]
Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.788353 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc"]
Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.797690 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2"]
Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.990675 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv" event={"ID":"5369081c-2142-4dfa-9482-b8d8d6d4195f","Type":"ContainerStarted","Data":"8be34f70c4593d5c78040bfc54eb6f7ee41d934537ba34a10bca90a7c2153e54"}
Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.992520 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s" event={"ID":"f2666d2b-c30c-40d4-bfab-0e6d00571ecc","Type":"ContainerStarted","Data":"1ebdce6bacc7f162e4c11c3bba3626df933f7dcda00ead5753bb77a5ea5cb4c6"}
Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.993770 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg" event={"ID":"e88d7782-bcf8-4d40-aa1c-269533471279","Type":"ContainerStarted","Data":"a073dd993faf86f781067304582afb2a5ee295ef9bec748a04b8670d474d19e7"}
Dec 02 08:00:37 crc kubenswrapper[4691]: I1202 08:00:37.995020 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc" event={"ID":"9e639d67-2200-474e-9be7-55bef7c97fe6","Type":"ContainerStarted","Data":"e3d8b3e698155d4b89c854e94d425d14b76af80ac2cfb9b544309b777054bec6"}
Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.002042 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2" event={"ID":"36d938bc-e3d6-4f21-8327-5f655a4ef54a","Type":"ContainerStarted","Data":"2aae63503815f1d783fe78dc1f602516e15aa675c945a5fe11c1b44b88655642"}
Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.002307 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mx8j9" podUID="acde365e-9d1d-494f-837f-8365c0be9034" containerName="registry-server" containerID="cri-o://4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29" gracePeriod=2
Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.143675 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb"]
Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.151567 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn"]
Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.159832 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api"
pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r"] Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.166773 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc"] Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.178565 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk"] Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.187666 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f"] Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.193460 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw"] Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.216560 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt"] Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.232207 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv"] Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.243072 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2"] Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.248015 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-9p97b"] Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.254272 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd"] Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.264729 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm"] Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.264794 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn"] Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.267821 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.267950 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.268057 4691 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.268108 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs 
podName:e837878c-4a1f-463b-913c-7df163c5ba27 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:40.268092259 +0000 UTC m=+888.052171121 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs") pod "openstack-operator-controller-manager-85b7b84db-d9nql" (UID: "e837878c-4a1f-463b-913c-7df163c5ba27") : secret "metrics-server-cert" not found Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.268211 4691 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.268264 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs podName:e837878c-4a1f-463b-913c-7df163c5ba27 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:40.268237682 +0000 UTC m=+888.052316544 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs") pod "openstack-operator-controller-manager-85b7b84db-d9nql" (UID: "e837878c-4a1f-463b-913c-7df163c5ba27") : secret "webhook-server-cert" not found Dec 02 08:00:38 crc kubenswrapper[4691]: W1202 08:00:38.273313 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a9a40d0_4970_4551_b68e_a9697f250e94.slice/crio-f2bfd5b3170e0e7f43c6652733372bfa76af066be794b9810c84d7343ad079d2 WatchSource:0}: Error finding container f2bfd5b3170e0e7f43c6652733372bfa76af066be794b9810c84d7343ad079d2: Status 404 returned error can't find the container with id f2bfd5b3170e0e7f43c6652733372bfa76af066be794b9810c84d7343ad079d2 Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.277308 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh"] Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.281566 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xbwwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-gbqw2_openstack-operators(09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.284145 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xbwwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-gbqw2_openstack-operators(09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.285394 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" podUID="09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.290073 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vxxtx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-p2dmm_openstack-operators(a1687ef3-9bf5-451e-aa8a-22ede53d9ed9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.292550 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vxxtx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-p2dmm_openstack-operators(a1687ef3-9bf5-451e-aa8a-22ede53d9ed9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.293645 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" podUID="a1687ef3-9bf5-451e-aa8a-22ede53d9ed9" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.354099 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9ntzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
rabbitmq-cluster-operator-manager-668c99d594-2kbsh_openstack-operators(b42f21ed-f361-4b6f-abc7-03b237501f65): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.354107 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5hc5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-6tdpt_openstack-operators(f4783498-99ba-42cc-9312-8e8c6b279e5a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.355406 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" podUID="b42f21ed-f361-4b6f-abc7-03b237501f65" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.359091 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dkbnq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-wnzqd_openstack-operators(4a080540-9871-4b8c-9d74-e3f1f3cf317c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.359226 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:2a3d21728a8bfb4e64617e63e61e2d1cb70a383ea3e8f846e0c3c3c02d2b0a9d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h4dwr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-slsfn_openstack-operators(dfa342c0-60d5-4025-a881-30d706833e2b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.359319 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5hc5l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-6tdpt_openstack-operators(f4783498-99ba-42cc-9312-8e8c6b279e5a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.361171 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" podUID="f4783498-99ba-42cc-9312-8e8c6b279e5a" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.381361 4691 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dkbnq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-wnzqd_openstack-operators(4a080540-9871-4b8c-9d74-e3f1f3cf317c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 02 08:00:38 crc kubenswrapper[4691]: E1202 08:00:38.382858 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" podUID="4a080540-9871-4b8c-9d74-e3f1f3cf317c" Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.724012 4691 util.go:48] "No ready sandbox for pod can be found. 
Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.892892 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acde365e-9d1d-494f-837f-8365c0be9034-catalog-content\") pod \"acde365e-9d1d-494f-837f-8365c0be9034\" (UID: \"acde365e-9d1d-494f-837f-8365c0be9034\") "
Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.893064 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxwvp\" (UniqueName: \"kubernetes.io/projected/acde365e-9d1d-494f-837f-8365c0be9034-kube-api-access-nxwvp\") pod \"acde365e-9d1d-494f-837f-8365c0be9034\" (UID: \"acde365e-9d1d-494f-837f-8365c0be9034\") "
Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.893112 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acde365e-9d1d-494f-837f-8365c0be9034-utilities\") pod \"acde365e-9d1d-494f-837f-8365c0be9034\" (UID: \"acde365e-9d1d-494f-837f-8365c0be9034\") "
Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.894410 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acde365e-9d1d-494f-837f-8365c0be9034-utilities" (OuterVolumeSpecName: "utilities") pod "acde365e-9d1d-494f-837f-8365c0be9034" (UID: "acde365e-9d1d-494f-837f-8365c0be9034"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.902964 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acde365e-9d1d-494f-837f-8365c0be9034-kube-api-access-nxwvp" (OuterVolumeSpecName: "kube-api-access-nxwvp") pod "acde365e-9d1d-494f-837f-8365c0be9034" (UID: "acde365e-9d1d-494f-837f-8365c0be9034"). InnerVolumeSpecName "kube-api-access-nxwvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.982250 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acde365e-9d1d-494f-837f-8365c0be9034-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acde365e-9d1d-494f-837f-8365c0be9034" (UID: "acde365e-9d1d-494f-837f-8365c0be9034"). InnerVolumeSpecName "catalog-content".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.995183 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acde365e-9d1d-494f-837f-8365c0be9034-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.995228 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxwvp\" (UniqueName: \"kubernetes.io/projected/acde365e-9d1d-494f-837f-8365c0be9034-kube-api-access-nxwvp\") on node \"crc\" DevicePath \"\"" Dec 02 08:00:38 crc kubenswrapper[4691]: I1202 08:00:38.995244 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acde365e-9d1d-494f-837f-8365c0be9034-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.019174 4691 generic.go:334] "Generic (PLEG): container finished" podID="acde365e-9d1d-494f-837f-8365c0be9034" containerID="4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29" exitCode=0 Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.019221 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx8j9" event={"ID":"acde365e-9d1d-494f-837f-8365c0be9034","Type":"ContainerDied","Data":"4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29"} Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.019257 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mx8j9" Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.019277 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mx8j9" event={"ID":"acde365e-9d1d-494f-837f-8365c0be9034","Type":"ContainerDied","Data":"3e81c5caaf1a11bcd64644823751ecfbd57d7a9340acf18e819ccee7d909119b"} Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.019297 4691 scope.go:117] "RemoveContainer" containerID="4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29" Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.024448 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw" event={"ID":"0361226c-435b-4221-b59d-74900b2552e1","Type":"ContainerStarted","Data":"2351d24c9fcf47b3cef78af8336caeb8c8d14af40f0fa2a181101492cd191e79"} Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.026023 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" event={"ID":"f4783498-99ba-42cc-9312-8e8c6b279e5a","Type":"ContainerStarted","Data":"0432022ad51fe2114a491c7771f77d9d9be7749387ff836fcaf4740f91300acd"} Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.072291 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" event={"ID":"4a080540-9871-4b8c-9d74-e3f1f3cf317c","Type":"ContainerStarted","Data":"918fcb7a290f9a9da393c6093c4ad74361b5e2bd7d49e44410d24df4e446ba44"} Dec 02 08:00:39 crc kubenswrapper[4691]: E1202 08:00:39.074516 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\", failed to 
\"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" podUID="f4783498-99ba-42cc-9312-8e8c6b279e5a" Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.074661 4691 scope.go:117] "RemoveContainer" containerID="ce6031b3742a89ac1a0469f4f3e4e0761e8a0760cf29a3c789651b48acfaed17" Dec 02 08:00:39 crc kubenswrapper[4691]: E1202 08:00:39.076264 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" podUID="4a080540-9871-4b8c-9d74-e3f1f3cf317c" Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.076412 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc" event={"ID":"759a905a-dc61-4206-862f-cb8b6f85882f","Type":"ContainerStarted","Data":"a04ac1a1a810be7fa90a1be44083464c712027cb2e73bbb1343bf845d08c0bc3"} Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.081058 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" event={"ID":"e8014abc-9e31-40d3-8e34-d595a8ef95b4","Type":"ContainerStarted","Data":"aea27e3c15dc76cf67e74f4c7c5a388dc746bce268bc3eb8d1ac594a7b480852"} Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.084353 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk" event={"ID":"f75595fc-314d-4aa9-bc60-e82c16361768","Type":"ContainerStarted","Data":"404fb9fa3f8c12f7e1eb70da16a3e0e9e45ec5e130dd69d65229c1c9bb3bc93d"} Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.093549 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn" event={"ID":"0667dbf1-e305-4d12-af7b-3d532a834609","Type":"ContainerStarted","Data":"d24312dec04e7833168646abedd151622d5a3f059e0d8f1f82557cd1edcbe464"} Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.096113 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mx8j9"] Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.096146 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv" event={"ID":"fd33241e-270e-4dbe-b024-e368b2050ece","Type":"ContainerStarted","Data":"4100e3eecd93dd7559edb6b6457bdae2fafd59a6dd0cbf15427ea04f17d4ee8e"} Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.097887 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" event={"ID":"09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b","Type":"ContainerStarted","Data":"1b667599e84148b86e45c17beee8a4fb8d0a1a19b49129e6b7d7b0ae8f214df1"} Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.116384 4691 scope.go:117] "RemoveContainer" containerID="84281e39ae271ce38e578df2791dafe530e92e9d45f37dff445f891419c28ba2" Dec 02 
08:00:39 crc kubenswrapper[4691]: E1202 08:00:39.120663 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" podUID="09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b"
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.121609 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb" event={"ID":"1a9a40d0-4970-4551-b68e-a9697f250e94","Type":"ContainerStarted","Data":"f2bfd5b3170e0e7f43c6652733372bfa76af066be794b9810c84d7343ad079d2"}
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.126812 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" event={"ID":"b42f21ed-f361-4b6f-abc7-03b237501f65","Type":"ContainerStarted","Data":"582f92769a3b057acd68f009c6d11d36d0726c8ce71dfc77889313236d6f0ec6"}
Dec 02 08:00:39 crc kubenswrapper[4691]: E1202 08:00:39.129469 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" podUID="b42f21ed-f361-4b6f-abc7-03b237501f65"
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.136701 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r" event={"ID":"c92e2d03-8848-432f-82f4-fd28b3b0fa34","Type":"ContainerStarted","Data":"2b29428aec5e7b7c250e2ee19fa50186ec8bdd860a44ed6c60f32add43e1c024"}
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.146231 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" event={"ID":"a1687ef3-9bf5-451e-aa8a-22ede53d9ed9","Type":"ContainerStarted","Data":"80be8953e61c2892dc590c162d321ae0037f8dc50da754aa57a637aed77d461b"}
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.148648 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn" event={"ID":"dfa342c0-60d5-4025-a881-30d706833e2b","Type":"ContainerStarted","Data":"e8efea6dd941c83e8bea7ef1be61444922166e967271c0962d2a331b4bfdc4e3"}
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.150097 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" event={"ID":"d82caf6e-e4a7-4474-8cd4-6d3f554ce608","Type":"ContainerStarted","Data":"1aa2e18a387988cc91b56216579ca5c01adc8c1ac78d4d3f29afd7f351b369c5"}
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.160188 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mx8j9"]
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.190534 4691 scope.go:117] "RemoveContainer" containerID="4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29"
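The "Error syncing pod" records above are the kubelet's image-pull back-off at work: each failed pull of the manager and kube-rbac-proxy images lengthens the delay before the next attempt, and the pod stays in ImagePullBackOff in the meantime. The same capped-doubling pattern shows up directly in this log's mount retries below (durationBeforeRetry 4s, then 8s, then 16s). A minimal Go sketch of such a policy follows; the nextBackoff helper, base delay, and cap are illustrative assumptions, not the kubelet's actual implementation.

package main

import (
	"fmt"
	"time"
)

// nextBackoff returns the delay before retry attempt n (0-based),
// doubling from base and saturating at limit. Illustrative only; the
// kubelet's real back-off bookkeeping lives in its internal packages.
func nextBackoff(base, limit time.Duration, n int) time.Duration {
	d := base
	for i := 0; i < n; i++ {
		d *= 2
		if d >= limit {
			return limit
		}
	}
	return d
}

func main() {
	// Reproduces the 4s -> 8s -> 16s progression visible in the
	// durationBeforeRetry values of the MountVolume errors below.
	for n := 0; n < 5; n++ {
		fmt.Println(nextBackoff(4*time.Second, 2*time.Minute, n))
	}
}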
Dec 02 08:00:39 crc kubenswrapper[4691]: E1202 08:00:39.191259 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29\": container with ID starting with 4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29 not found: ID does not exist" containerID="4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29"
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.191297 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29"} err="failed to get container status \"4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29\": rpc error: code = NotFound desc = could not find container \"4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29\": container with ID starting with 4b4758830010ac5ac8ec1fb388c87bb52e250762c08da73ef21a075c6dac5a29 not found: ID does not exist"
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.191324 4691 scope.go:117] "RemoveContainer" containerID="ce6031b3742a89ac1a0469f4f3e4e0761e8a0760cf29a3c789651b48acfaed17"
Dec 02 08:00:39 crc kubenswrapper[4691]: E1202 08:00:39.191642 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce6031b3742a89ac1a0469f4f3e4e0761e8a0760cf29a3c789651b48acfaed17\": container with ID starting with ce6031b3742a89ac1a0469f4f3e4e0761e8a0760cf29a3c789651b48acfaed17 not found: ID does not exist" containerID="ce6031b3742a89ac1a0469f4f3e4e0761e8a0760cf29a3c789651b48acfaed17"
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.191689 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce6031b3742a89ac1a0469f4f3e4e0761e8a0760cf29a3c789651b48acfaed17"} err="failed to get container status \"ce6031b3742a89ac1a0469f4f3e4e0761e8a0760cf29a3c789651b48acfaed17\": rpc error: code = NotFound desc = could not find container \"ce6031b3742a89ac1a0469f4f3e4e0761e8a0760cf29a3c789651b48acfaed17\": container with ID starting with ce6031b3742a89ac1a0469f4f3e4e0761e8a0760cf29a3c789651b48acfaed17 not found: ID does not exist"
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.191707 4691 scope.go:117] "RemoveContainer" containerID="84281e39ae271ce38e578df2791dafe530e92e9d45f37dff445f891419c28ba2"
Dec 02 08:00:39 crc kubenswrapper[4691]: E1202 08:00:39.191996 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84281e39ae271ce38e578df2791dafe530e92e9d45f37dff445f891419c28ba2\": container with ID starting with 84281e39ae271ce38e578df2791dafe530e92e9d45f37dff445f891419c28ba2 not found: ID does not exist" containerID="84281e39ae271ce38e578df2791dafe530e92e9d45f37dff445f891419c28ba2"
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.192019 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84281e39ae271ce38e578df2791dafe530e92e9d45f37dff445f891419c28ba2"} err="failed to get container status \"84281e39ae271ce38e578df2791dafe530e92e9d45f37dff445f891419c28ba2\": rpc error: code = NotFound desc = could not find container \"84281e39ae271ce38e578df2791dafe530e92e9d45f37dff445f891419c28ba2\": container with ID starting with 84281e39ae271ce38e578df2791dafe530e92e9d45f37dff445f891419c28ba2 not found: ID does not exist"
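Each RemoveContainer above is answered by a gRPC NotFound from the runtime: the community-operators-mx8j9 pod had already been torn down, so CRI-O no longer knows the container IDs, and the kubelet logs the DeleteContainer error and moves on. Treating NotFound as success is the standard idempotent-cleanup pattern for this race. A small Go sketch of that pattern follows; removeContainer is a hypothetical stand-in for the CRI call, and only the status/codes helpers are real gRPC API.

package cleanup

import (
	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// CleanupContainer removes a container but treats a gRPC NotFound as
// success: if the runtime already deleted it (as CRI-O had above), the
// desired end state is reached. removeContainer is a hypothetical
// stand-in for a CRI RemoveContainer/ContainerStatus round trip.
func CleanupContainer(id string, removeContainer func(id string) error) error {
	err := removeContainer(id)
	if err != nil && status.Code(err) != codes.NotFound {
		return err
	}
	// nil error, or "could not find container": nothing left to do.
	return nil
}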
Dec 02 08:00:39 crc kubenswrapper[4691]: E1202 08:00:39.192717 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" podUID="a1687ef3-9bf5-451e-aa8a-22ede53d9ed9"
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.403346 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert\") pod \"infra-operator-controller-manager-57548d458d-n5t7p\" (UID: \"b930ff47-307d-47b3-9b84-54e5860ee2db\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p"
Dec 02 08:00:39 crc kubenswrapper[4691]: E1202 08:00:39.404288 4691 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Dec 02 08:00:39 crc kubenswrapper[4691]: E1202 08:00:39.404369 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert podName:b930ff47-307d-47b3-9b84-54e5860ee2db nodeName:}" failed. No retries permitted until 2025-12-02 08:00:43.404351885 +0000 UTC m=+891.188430747 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert") pod "infra-operator-controller-manager-57548d458d-n5t7p" (UID: "b930ff47-307d-47b3-9b84-54e5860ee2db") : secret "infra-operator-webhook-server-cert" not found
Dec 02 08:00:39 crc kubenswrapper[4691]: I1202 08:00:39.812537 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v\" (UID: \"e6bec2f4-8aea-472b-a0f9-591b744f9fe4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v"
Dec 02 08:00:39 crc kubenswrapper[4691]: E1202 08:00:39.812739 4691 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Dec 02 08:00:39 crc kubenswrapper[4691]: E1202 08:00:39.812829 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert podName:e6bec2f4-8aea-472b-a0f9-591b744f9fe4 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:43.812811313 +0000 UTC m=+891.596890175 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" (UID: "e6bec2f4-8aea-472b-a0f9-591b744f9fe4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 08:00:40 crc kubenswrapper[4691]: E1202 08:00:40.166396 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" podUID="f4783498-99ba-42cc-9312-8e8c6b279e5a" Dec 02 08:00:40 crc kubenswrapper[4691]: E1202 08:00:40.167226 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" podUID="b42f21ed-f361-4b6f-abc7-03b237501f65" Dec 02 08:00:40 crc kubenswrapper[4691]: E1202 08:00:40.167450 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" podUID="09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b" Dec 02 08:00:40 crc kubenswrapper[4691]: E1202 08:00:40.167523 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" podUID="a1687ef3-9bf5-451e-aa8a-22ede53d9ed9" Dec 02 08:00:40 crc kubenswrapper[4691]: E1202 08:00:40.182187 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" podUID="4a080540-9871-4b8c-9d74-e3f1f3cf317c" Dec 02 08:00:40 crc kubenswrapper[4691]: I1202 08:00:40.321432 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:40 crc kubenswrapper[4691]: I1202 08:00:40.321588 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:40 crc kubenswrapper[4691]: E1202 08:00:40.321622 4691 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 08:00:40 crc kubenswrapper[4691]: E1202 08:00:40.321696 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs podName:e837878c-4a1f-463b-913c-7df163c5ba27 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:44.321676157 +0000 UTC m=+892.105755019 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs") pod "openstack-operator-controller-manager-85b7b84db-d9nql" (UID: "e837878c-4a1f-463b-913c-7df163c5ba27") : secret "metrics-server-cert" not found Dec 02 08:00:40 crc kubenswrapper[4691]: E1202 08:00:40.321925 4691 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 02 08:00:40 crc kubenswrapper[4691]: E1202 08:00:40.323316 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs podName:e837878c-4a1f-463b-913c-7df163c5ba27 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:44.321971774 +0000 UTC m=+892.106050676 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs") pod "openstack-operator-controller-manager-85b7b84db-d9nql" (UID: "e837878c-4a1f-463b-913c-7df163c5ba27") : secret "webhook-server-cert" not found Dec 02 08:00:40 crc kubenswrapper[4691]: I1202 08:00:40.582027 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acde365e-9d1d-494f-837f-8365c0be9034" path="/var/lib/kubelet/pods/acde365e-9d1d-494f-837f-8365c0be9034/volumes" Dec 02 08:00:43 crc kubenswrapper[4691]: I1202 08:00:43.495085 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert\") pod \"infra-operator-controller-manager-57548d458d-n5t7p\" (UID: \"b930ff47-307d-47b3-9b84-54e5860ee2db\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" Dec 02 08:00:43 crc kubenswrapper[4691]: E1202 08:00:43.495264 4691 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 02 08:00:43 crc kubenswrapper[4691]: E1202 08:00:43.495840 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert podName:b930ff47-307d-47b3-9b84-54e5860ee2db nodeName:}" failed. 
No retries permitted until 2025-12-02 08:00:51.495822516 +0000 UTC m=+899.279901368 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert") pod "infra-operator-controller-manager-57548d458d-n5t7p" (UID: "b930ff47-307d-47b3-9b84-54e5860ee2db") : secret "infra-operator-webhook-server-cert" not found Dec 02 08:00:43 crc kubenswrapper[4691]: I1202 08:00:43.901790 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v\" (UID: \"e6bec2f4-8aea-472b-a0f9-591b744f9fe4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" Dec 02 08:00:43 crc kubenswrapper[4691]: E1202 08:00:43.901950 4691 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 08:00:43 crc kubenswrapper[4691]: E1202 08:00:43.902019 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert podName:e6bec2f4-8aea-472b-a0f9-591b744f9fe4 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:51.902000328 +0000 UTC m=+899.686079190 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" (UID: "e6bec2f4-8aea-472b-a0f9-591b744f9fe4") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 02 08:00:44 crc kubenswrapper[4691]: I1202 08:00:44.325262 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:44 crc kubenswrapper[4691]: I1202 08:00:44.325533 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:00:44 crc kubenswrapper[4691]: E1202 08:00:44.325564 4691 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 02 08:00:44 crc kubenswrapper[4691]: E1202 08:00:44.325719 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs podName:e837878c-4a1f-463b-913c-7df163c5ba27 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:52.32570063 +0000 UTC m=+900.109779492 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs") pod "openstack-operator-controller-manager-85b7b84db-d9nql" (UID: "e837878c-4a1f-463b-913c-7df163c5ba27") : secret "metrics-server-cert" not found
Dec 02 08:00:44 crc kubenswrapper[4691]: E1202 08:00:44.325655 4691 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 02 08:00:44 crc kubenswrapper[4691]: E1202 08:00:44.325898 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs podName:e837878c-4a1f-463b-913c-7df163c5ba27 nodeName:}" failed. No retries permitted until 2025-12-02 08:00:52.325874325 +0000 UTC m=+900.109953197 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs") pod "openstack-operator-controller-manager-85b7b84db-d9nql" (UID: "e837878c-4a1f-463b-913c-7df163c5ba27") : secret "webhook-server-cert" not found
Dec 02 08:00:51 crc kubenswrapper[4691]: I1202 08:00:51.535818 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert\") pod \"infra-operator-controller-manager-57548d458d-n5t7p\" (UID: \"b930ff47-307d-47b3-9b84-54e5860ee2db\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p"
Dec 02 08:00:51 crc kubenswrapper[4691]: I1202 08:00:51.542806 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b930ff47-307d-47b3-9b84-54e5860ee2db-cert\") pod \"infra-operator-controller-manager-57548d458d-n5t7p\" (UID: \"b930ff47-307d-47b3-9b84-54e5860ee2db\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p"
Dec 02 08:00:51 crc kubenswrapper[4691]: I1202 08:00:51.741434 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p"
Dec 02 08:00:51 crc kubenswrapper[4691]: I1202 08:00:51.941966 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v\" (UID: \"e6bec2f4-8aea-472b-a0f9-591b744f9fe4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v"
Dec 02 08:00:51 crc kubenswrapper[4691]: I1202 08:00:51.945709 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e6bec2f4-8aea-472b-a0f9-591b744f9fe4-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v\" (UID: \"e6bec2f4-8aea-472b-a0f9-591b744f9fe4\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v"
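By 08:00:51 the cert mounts for infra-operator and openstack-baremetal-operator succeed, so their webhook-server-cert secrets appeared in the interim (presumably created by the operators' certificate machinery; the log does not say by what). The openstack-operator's webhook-server-cert is still missing below and stays on the doubled 16s retry. A quick way to see which of these secrets exist is a client-go query; this sketch assumes only a reachable default kubeconfig, and the secret names are copied verbatim from the MountVolume errors in this log.

package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load the default kubeconfig (~/.kube/config).
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Secret names taken from the MountVolume.SetUp errors in this log.
	for _, name := range []string{
		"infra-operator-webhook-server-cert",
		"openstack-baremetal-operator-webhook-server-cert",
		"metrics-server-cert",
		"webhook-server-cert",
	} {
		_, err := cs.CoreV1().Secrets("openstack-operators").Get(context.TODO(), name, metav1.GetOptions{})
		fmt.Printf("%-50s present=%v err=%v\n", name, err == nil, err)
	}
}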
Dec 02 08:00:52 crc kubenswrapper[4691]: I1202 08:00:52.048406 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v"
Dec 02 08:00:52 crc kubenswrapper[4691]: I1202 08:00:52.348079 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql"
Dec 02 08:00:52 crc kubenswrapper[4691]: I1202 08:00:52.348564 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql"
Dec 02 08:00:52 crc kubenswrapper[4691]: E1202 08:00:52.348704 4691 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Dec 02 08:00:52 crc kubenswrapper[4691]: E1202 08:00:52.348811 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs podName:e837878c-4a1f-463b-913c-7df163c5ba27 nodeName:}" failed. No retries permitted until 2025-12-02 08:01:08.348788936 +0000 UTC m=+916.132867798 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs") pod "openstack-operator-controller-manager-85b7b84db-d9nql" (UID: "e837878c-4a1f-463b-913c-7df163c5ba27") : secret "webhook-server-cert" not found
Dec 02 08:00:52 crc kubenswrapper[4691]: I1202 08:00:52.351260 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-metrics-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql"
Dec 02 08:00:55 crc kubenswrapper[4691]: E1202 08:00:55.591526 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801"
Dec 02 08:00:55 crc kubenswrapper[4691]: E1202 08:00:55.591783 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-87p6r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-66zbg_openstack-operators(e88d7782-bcf8-4d40-aa1c-269533471279): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:00:56 crc kubenswrapper[4691]: E1202 08:00:56.789264 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809" Dec 02 08:00:56 crc kubenswrapper[4691]: E1202 08:00:56.789809 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-25rkj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-n5f6s_openstack-operators(f2666d2b-c30c-40d4-bfab-0e6d00571ecc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:00 crc kubenswrapper[4691]: E1202 08:01:00.037017 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 02 08:01:00 crc kubenswrapper[4691]: E1202 08:01:00.037527 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-26qcg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-nlwnv_openstack-operators(fd33241e-270e-4dbe-b024-e368b2050ece): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:02 crc kubenswrapper[4691]: E1202 08:01:02.180040 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 02 08:01:02 crc kubenswrapper[4691]: E1202 08:01:02.180311 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2l7r4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-w9hbv_openstack-operators(5369081c-2142-4dfa-9482-b8d8d6d4195f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.790466 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jrs4t"] Dec 02 08:01:02 crc kubenswrapper[4691]: E1202 08:01:02.791282 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acde365e-9d1d-494f-837f-8365c0be9034" containerName="registry-server" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.791304 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="acde365e-9d1d-494f-837f-8365c0be9034" containerName="registry-server" Dec 02 08:01:02 crc kubenswrapper[4691]: E1202 08:01:02.791332 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acde365e-9d1d-494f-837f-8365c0be9034" containerName="extract-content" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.791345 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="acde365e-9d1d-494f-837f-8365c0be9034" containerName="extract-content" Dec 02 08:01:02 crc kubenswrapper[4691]: E1202 08:01:02.791367 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acde365e-9d1d-494f-837f-8365c0be9034" containerName="extract-utilities" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.791377 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="acde365e-9d1d-494f-837f-8365c0be9034" containerName="extract-utilities" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.791516 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="acde365e-9d1d-494f-837f-8365c0be9034" containerName="registry-server" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.792908 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.799533 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jrs4t"] Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.893933 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca399b1a-df12-435c-8ebf-6f787ed29a8e-catalog-content\") pod \"redhat-operators-jrs4t\" (UID: \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\") " pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.894176 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsq7v\" (UniqueName: \"kubernetes.io/projected/ca399b1a-df12-435c-8ebf-6f787ed29a8e-kube-api-access-rsq7v\") pod \"redhat-operators-jrs4t\" (UID: \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\") " pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.894661 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca399b1a-df12-435c-8ebf-6f787ed29a8e-utilities\") pod \"redhat-operators-jrs4t\" (UID: \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\") " pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.996102 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca399b1a-df12-435c-8ebf-6f787ed29a8e-catalog-content\") pod \"redhat-operators-jrs4t\" (UID: \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\") " pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.996231 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsq7v\" (UniqueName: \"kubernetes.io/projected/ca399b1a-df12-435c-8ebf-6f787ed29a8e-kube-api-access-rsq7v\") pod \"redhat-operators-jrs4t\" (UID: \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\") " pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.996299 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca399b1a-df12-435c-8ebf-6f787ed29a8e-utilities\") pod \"redhat-operators-jrs4t\" (UID: \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\") " pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.996791 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca399b1a-df12-435c-8ebf-6f787ed29a8e-catalog-content\") pod \"redhat-operators-jrs4t\" (UID: \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\") " pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:02 crc kubenswrapper[4691]: I1202 08:01:02.997028 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca399b1a-df12-435c-8ebf-6f787ed29a8e-utilities\") pod \"redhat-operators-jrs4t\" (UID: \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\") " pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:03 crc kubenswrapper[4691]: I1202 08:01:03.105105 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rsq7v\" (UniqueName: \"kubernetes.io/projected/ca399b1a-df12-435c-8ebf-6f787ed29a8e-kube-api-access-rsq7v\") pod \"redhat-operators-jrs4t\" (UID: \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\") " pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:03 crc kubenswrapper[4691]: I1202 08:01:03.122576 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:05 crc kubenswrapper[4691]: E1202 08:01:05.017421 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 02 08:01:05 crc kubenswrapper[4691]: E1202 08:01:05.017894 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-q48mz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-d898r_openstack-operators(c92e2d03-8848-432f-82f4-fd28b3b0fa34): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:06 crc kubenswrapper[4691]: E1202 08:01:06.329808 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85" Dec 02 08:01:06 crc kubenswrapper[4691]: E1202 08:01:06.330063 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:9f68d7bc8c6bce38f46dee8a8272d5365c49fe7b32b2af52e8ac884e212f3a85,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9vw5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-78b4bc895b-c7ftc_openstack-operators(9e639d67-2200-474e-9be7-55bef7c97fe6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:06 crc kubenswrapper[4691]: I1202 08:01:06.332682 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:01:06 crc kubenswrapper[4691]: E1202 08:01:06.672894 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7" Dec 02 08:01:06 crc kubenswrapper[4691]: E1202 08:01:06.673442 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:600ca007e493d3af0fcc2ebac92e8da5efd2afe812b62d7d3d4dd0115bdf05d7,Command:[/manager],Args:[--leader-elect 
--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-962rp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-lhkdn_openstack-operators(0667dbf1-e305-4d12-af7b-3d532a834609): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:07 crc kubenswrapper[4691]: E1202 08:01:07.190497 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 02 08:01:07 crc kubenswrapper[4691]: E1202 08:01:07.190729 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-887qx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-44twc_openstack-operators(759a905a-dc61-4206-862f-cb8b6f85882f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:07 crc kubenswrapper[4691]: E1202 08:01:07.819314 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9" Dec 02 08:01:07 crc kubenswrapper[4691]: E1202 08:01:07.819493 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vd9rs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-28bs2_openstack-operators(36d938bc-e3d6-4f21-8327-5f655a4ef54a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:08 crc kubenswrapper[4691]: I1202 08:01:08.398841 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:01:08 crc kubenswrapper[4691]: I1202 08:01:08.406333 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/e837878c-4a1f-463b-913c-7df163c5ba27-webhook-certs\") pod \"openstack-operator-controller-manager-85b7b84db-d9nql\" (UID: \"e837878c-4a1f-463b-913c-7df163c5ba27\") " pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:01:08 crc kubenswrapper[4691]: I1202 08:01:08.602992 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-82m5v" Dec 02 08:01:08 crc kubenswrapper[4691]: I1202 08:01:08.612561 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:01:10 crc kubenswrapper[4691]: E1202 08:01:10.526215 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 02 08:01:10 crc kubenswrapper[4691]: E1202 08:01:10.526471 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l4gw6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-2g56f_openstack-operators(d82caf6e-e4a7-4474-8cd4-6d3f554ce608): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:14 crc kubenswrapper[4691]: E1202 08:01:14.900955 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 02 08:01:14 crc kubenswrapper[4691]: E1202 08:01:14.901705 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xbwwv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-gbqw2_openstack-operators(09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:15 crc kubenswrapper[4691]: E1202 08:01:15.602377 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 02 08:01:15 crc kubenswrapper[4691]: E1202 08:01:15.603097 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 
500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rgn6f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-9p97b_openstack-operators(e8014abc-9e31-40d3-8e34-d595a8ef95b4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:16 crc kubenswrapper[4691]: E1202 08:01:16.371032 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94" Dec 02 08:01:16 crc kubenswrapper[4691]: E1202 08:01:16.371255 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dkbnq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-wnzqd_openstack-operators(4a080540-9871-4b8c-9d74-e3f1f3cf317c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:17 crc kubenswrapper[4691]: E1202 08:01:17.625090 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 02 08:01:17 crc kubenswrapper[4691]: E1202 08:01:17.625510 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9ntzn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-2kbsh_openstack-operators(b42f21ed-f361-4b6f-abc7-03b237501f65): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:17 crc kubenswrapper[4691]: E1202 08:01:17.627081 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" podUID="b42f21ed-f361-4b6f-abc7-03b237501f65" Dec 02 08:01:17 crc kubenswrapper[4691]: E1202 08:01:17.667295 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 08:01:17 crc kubenswrapper[4691]: E1202 08:01:17.667439 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-87p6r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-66zbg_openstack-operators(e88d7782-bcf8-4d40-aa1c-269533471279): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 02 08:01:17 crc kubenswrapper[4691]: 
E1202 08:01:17.668590 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"]" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg" podUID="e88d7782-bcf8-4d40-aa1c-269533471279" Dec 02 08:01:18 crc kubenswrapper[4691]: I1202 08:01:18.213988 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jrs4t"] Dec 02 08:01:18 crc kubenswrapper[4691]: I1202 08:01:18.220439 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p"] Dec 02 08:01:18 crc kubenswrapper[4691]: I1202 08:01:18.237778 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v"] Dec 02 08:01:18 crc kubenswrapper[4691]: I1202 08:01:18.271314 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql"] Dec 02 08:01:19 crc kubenswrapper[4691]: W1202 08:01:19.038523 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca399b1a_df12_435c_8ebf_6f787ed29a8e.slice/crio-2b9632d6c61716f54cf04ffcbec6588f4c166c5ce973a8991ac752aa3b9718af WatchSource:0}: Error finding container 2b9632d6c61716f54cf04ffcbec6588f4c166c5ce973a8991ac752aa3b9718af: Status 404 returned error can't find the container with id 2b9632d6c61716f54cf04ffcbec6588f4c166c5ce973a8991ac752aa3b9718af Dec 02 08:01:19 crc kubenswrapper[4691]: E1202 08:01:19.041587 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 02 08:01:19 crc kubenswrapper[4691]: E1202 08:01:19.041736 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h4dwr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-slsfn_openstack-operators(dfa342c0-60d5-4025-a881-30d706833e2b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:01:19 crc kubenswrapper[4691]: W1202 08:01:19.042027 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6bec2f4_8aea_472b_a0f9_591b744f9fe4.slice/crio-05494252853cb68384bd1315370e45b2ddee472ba42eb0dd41613acc29566e10 WatchSource:0}: Error finding container 05494252853cb68384bd1315370e45b2ddee472ba42eb0dd41613acc29566e10: Status 404 returned error can't find the container with id 05494252853cb68384bd1315370e45b2ddee472ba42eb0dd41613acc29566e10 Dec 02 08:01:19 crc kubenswrapper[4691]: E1202 08:01:19.043167 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn" podUID="dfa342c0-60d5-4025-a881-30d706833e2b" Dec 02 08:01:19 crc kubenswrapper[4691]: W1202 08:01:19.043619 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb930ff47_307d_47b3_9b84_54e5860ee2db.slice/crio-e15f9cbd3a45f5d35b7c50543f402343f2eed0693397396bb81b1cd41a0152a8 WatchSource:0}: Error finding container e15f9cbd3a45f5d35b7c50543f402343f2eed0693397396bb81b1cd41a0152a8: Status 404 returned error can't find the container with id e15f9cbd3a45f5d35b7c50543f402343f2eed0693397396bb81b1cd41a0152a8 Dec 02 08:01:19 crc kubenswrapper[4691]: W1202 08:01:19.047465 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode837878c_4a1f_463b_913c_7df163c5ba27.slice/crio-2cb15d8277687b7627d403cc8c9f56a407a30b11d9eab6a91ef38f26bc83be06 WatchSource:0}: Error finding container 2cb15d8277687b7627d403cc8c9f56a407a30b11d9eab6a91ef38f26bc83be06: Status 404 returned error can't find the container with id 2cb15d8277687b7627d403cc8c9f56a407a30b11d9eab6a91ef38f26bc83be06 Dec 02 08:01:19 crc kubenswrapper[4691]: I1202 08:01:19.994861 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" event={"ID":"e837878c-4a1f-463b-913c-7df163c5ba27","Type":"ContainerStarted","Data":"c00dcaae89d2e3d1df7005e5e687cd9089a84d085a3758593f15e3fc1d118d42"} 
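Note on the "pull QPS exceeded" failure above (the swift-operator manager container): unlike the "context canceled" errors, that pull never reached the registry; kubelet rejected it client-side. Kubelet throttles image pulls through a token bucket governed by registryPullQPS and registryBurst in the KubeletConfiguration (the documented defaults are 5 pulls/s with a burst of 10), so a namespace's worth of operator images all requested at the same instant can exhaust the burst. A minimal sketch of that behaviour, assuming those defaults and using golang.org/x/time/rate in place of kubelet's internal limiter:

// Hypothetical illustration, not kubelet source: a token bucket with the
// documented registryPullQPS/registryBurst defaults rejects simultaneous
// pull requests once the burst is spent, which kubelet surfaces as
// ErrImagePull: "pull QPS exceeded".
package main

import (
	"fmt"

	"golang.org/x/time/rate"
)

func main() {
	// Refill at 5 tokens/s (registryPullQPS), hold at most 10 (registryBurst).
	limiter := rate.NewLimiter(rate.Limit(5), 10)

	// ~18 operator pods ask for their images at the same instant.
	for i := 1; i <= 18; i++ {
		if limiter.Allow() {
			fmt.Printf("pull %2d: admitted\n", i)
		} else {
			fmt.Printf("pull %2d: rejected (pull QPS exceeded)\n", i)
		}
	}
}

The rejected pulls are retried under the normal image backoff, which is why the same pods reappear below with ImagePullBackOff rather than staying failed.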
Dec 02 08:01:19 crc kubenswrapper[4691]: I1202 08:01:19.995371 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" event={"ID":"e837878c-4a1f-463b-913c-7df163c5ba27","Type":"ContainerStarted","Data":"2cb15d8277687b7627d403cc8c9f56a407a30b11d9eab6a91ef38f26bc83be06"} Dec 02 08:01:19 crc kubenswrapper[4691]: I1202 08:01:19.995538 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:01:19 crc kubenswrapper[4691]: I1202 08:01:19.997587 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk" event={"ID":"f75595fc-314d-4aa9-bc60-e82c16361768","Type":"ContainerStarted","Data":"fda6cc7fd612c09d0047bd6bc7ff07c052c3d7c1fdb8b4b4e8da8ec2d5c611f8"} Dec 02 08:01:20 crc kubenswrapper[4691]: I1202 08:01:20.002241 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb" event={"ID":"1a9a40d0-4970-4551-b68e-a9697f250e94","Type":"ContainerStarted","Data":"1c9135caa0751b59040eaa602a5910e9db3bb882099f53f7a42426d0d8aebeaa"} Dec 02 08:01:20 crc kubenswrapper[4691]: I1202 08:01:20.003288 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" event={"ID":"b930ff47-307d-47b3-9b84-54e5860ee2db","Type":"ContainerStarted","Data":"e15f9cbd3a45f5d35b7c50543f402343f2eed0693397396bb81b1cd41a0152a8"} Dec 02 08:01:20 crc kubenswrapper[4691]: I1202 08:01:20.009252 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" event={"ID":"a1687ef3-9bf5-451e-aa8a-22ede53d9ed9","Type":"ContainerStarted","Data":"25d86911dfedac33dc1828b4f6c693ca81d8527f652b4d7d2319d4bd2a8d07a1"} Dec 02 08:01:20 crc kubenswrapper[4691]: I1202 08:01:20.013215 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" event={"ID":"f4783498-99ba-42cc-9312-8e8c6b279e5a","Type":"ContainerStarted","Data":"557bf45b8be17b11930ed1551eb44d6ac5fb090e381a5d0685d12e561e12aad7"} Dec 02 08:01:20 crc kubenswrapper[4691]: I1202 08:01:20.026081 4691 generic.go:334] "Generic (PLEG): container finished" podID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" containerID="619aea510c1d1a8bb66172719818ef85cbb96788b06f0df3e5f877d2b13335e2" exitCode=0 Dec 02 08:01:20 crc kubenswrapper[4691]: I1202 08:01:20.026198 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrs4t" event={"ID":"ca399b1a-df12-435c-8ebf-6f787ed29a8e","Type":"ContainerDied","Data":"619aea510c1d1a8bb66172719818ef85cbb96788b06f0df3e5f877d2b13335e2"} Dec 02 08:01:20 crc kubenswrapper[4691]: I1202 08:01:20.026528 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrs4t" event={"ID":"ca399b1a-df12-435c-8ebf-6f787ed29a8e","Type":"ContainerStarted","Data":"2b9632d6c61716f54cf04ffcbec6588f4c166c5ce973a8991ac752aa3b9718af"} Dec 02 08:01:20 crc kubenswrapper[4691]: I1202 08:01:20.029332 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" 
event={"ID":"e6bec2f4-8aea-472b-a0f9-591b744f9fe4","Type":"ContainerStarted","Data":"05494252853cb68384bd1315370e45b2ddee472ba42eb0dd41613acc29566e10"} Dec 02 08:01:20 crc kubenswrapper[4691]: I1202 08:01:20.031360 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw" event={"ID":"0361226c-435b-4221-b59d-74900b2552e1","Type":"ContainerStarted","Data":"1545fc0a63ec1e780aa1fd52a4d6f99cc282c05fb45942f58990abb14ce4278b"} Dec 02 08:01:20 crc kubenswrapper[4691]: I1202 08:01:20.044476 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" podStartSLOduration=44.044457771 podStartE2EDuration="44.044457771s" podCreationTimestamp="2025-12-02 08:00:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:01:20.03606529 +0000 UTC m=+927.820144172" watchObservedRunningTime="2025-12-02 08:01:20.044457771 +0000 UTC m=+927.828536633" Dec 02 08:01:20 crc kubenswrapper[4691]: E1202 08:01:20.364308 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc" podUID="759a905a-dc61-4206-862f-cb8b6f85882f" Dec 02 08:01:20 crc kubenswrapper[4691]: E1202 08:01:20.630229 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" podUID="d82caf6e-e4a7-4474-8cd4-6d3f554ce608" Dec 02 08:01:20 crc kubenswrapper[4691]: E1202 08:01:20.639143 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" podUID="4a080540-9871-4b8c-9d74-e3f1f3cf317c" Dec 02 08:01:20 crc kubenswrapper[4691]: E1202 08:01:20.645665 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" podUID="09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b" Dec 02 08:01:20 crc kubenswrapper[4691]: E1202 08:01:20.647334 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r" podUID="c92e2d03-8848-432f-82f4-fd28b3b0fa34" Dec 02 08:01:20 crc kubenswrapper[4691]: E1202 08:01:20.667225 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn" podUID="0667dbf1-e305-4d12-af7b-3d532a834609" Dec 02 08:01:20 crc kubenswrapper[4691]: E1202 08:01:20.676362 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2" podUID="36d938bc-e3d6-4f21-8327-5f655a4ef54a" Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.084289 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" event={"ID":"a1687ef3-9bf5-451e-aa8a-22ede53d9ed9","Type":"ContainerStarted","Data":"bfc92b4594a308fc3b9bf4d7d8fac13d53e903ddd223c5bbe0fe9e7a41198cec"} Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.085424 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.087555 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk" event={"ID":"f75595fc-314d-4aa9-bc60-e82c16361768","Type":"ContainerStarted","Data":"ad6e182940ad78d4fd2ab935a954d75406bd92074329f0dbe299062784adf3c8"} Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.088077 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk" Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.090081 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" event={"ID":"f4783498-99ba-42cc-9312-8e8c6b279e5a","Type":"ContainerStarted","Data":"f846ad9b1289e559ef4f77029e639be544bd3f300d89f94ce307cbc0cdbe7f63"} Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.090530 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.092858 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" event={"ID":"09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b","Type":"ContainerStarted","Data":"d5f69d37780014fe23f306f0074407cb05ab2fdb46eb8f755b671fd6c0694324"} Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.094678 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" event={"ID":"d82caf6e-e4a7-4474-8cd4-6d3f554ce608","Type":"ContainerStarted","Data":"d68aae92757dd7ee6ff8fecc5cb3db2ebd73f5a7f905c8a8a87a42a188395082"} Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.096020 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn" event={"ID":"0667dbf1-e305-4d12-af7b-3d532a834609","Type":"ContainerStarted","Data":"45d2300d930967207025b34c1921ef98e26231c0766f7398b2a0491dfad50ad9"} Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.098266 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2" event={"ID":"36d938bc-e3d6-4f21-8327-5f655a4ef54a","Type":"ContainerStarted","Data":"6ec6fa1724ef2619c63711eb9a43828a3df6d16b965d7c4e5345c2c062de457f"} Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.100240 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" 
event={"ID":"4a080540-9871-4b8c-9d74-e3f1f3cf317c","Type":"ContainerStarted","Data":"dbd86a5f011eb33ece5d4603f843925644b84678249adf0c45710ec932871501"} Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.102578 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb" event={"ID":"1a9a40d0-4970-4551-b68e-a9697f250e94","Type":"ContainerStarted","Data":"c01a666fbb0b6efb045f97f90a7c5101a6c485d82060c6a84afab9b24635149d"} Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.102776 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb" Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.105183 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc" event={"ID":"759a905a-dc61-4206-862f-cb8b6f85882f","Type":"ContainerStarted","Data":"fc13ce660f855f89503c559b89f503020306d60b15e006ae6208dab16fead508"} Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.111783 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" podStartSLOduration=6.735089591 podStartE2EDuration="46.111746664s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.289937138 +0000 UTC m=+886.074016000" lastFinishedPulling="2025-12-02 08:01:17.666594211 +0000 UTC m=+925.450673073" observedRunningTime="2025-12-02 08:01:21.104557083 +0000 UTC m=+928.888635945" watchObservedRunningTime="2025-12-02 08:01:21.111746664 +0000 UTC m=+928.895825526" Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.121187 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r" event={"ID":"c92e2d03-8848-432f-82f4-fd28b3b0fa34","Type":"ContainerStarted","Data":"6f44371772e7a6909deb3c07ec054f3913f495c0e41194d4b52130e7d0dd9784"} Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.124678 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw" event={"ID":"0361226c-435b-4221-b59d-74900b2552e1","Type":"ContainerStarted","Data":"183e13f90fa6566bb2e051cacb7366b179ce77f4363b2bed7a3898777ae2c4a2"} Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.124708 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw" Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.138496 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk" podStartSLOduration=15.641406953 podStartE2EDuration="46.138470446s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.242746022 +0000 UTC m=+886.026824884" lastFinishedPulling="2025-12-02 08:01:08.739809515 +0000 UTC m=+916.523888377" observedRunningTime="2025-12-02 08:01:21.136307331 +0000 UTC m=+928.920386193" watchObservedRunningTime="2025-12-02 08:01:21.138470446 +0000 UTC m=+928.922549308" Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.154715 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb" podStartSLOduration=13.923483122 
podStartE2EDuration="46.154696104s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.276528821 +0000 UTC m=+886.060607683" lastFinishedPulling="2025-12-02 08:01:10.507741793 +0000 UTC m=+918.291820665" observedRunningTime="2025-12-02 08:01:21.15296089 +0000 UTC m=+928.937039752" watchObservedRunningTime="2025-12-02 08:01:21.154696104 +0000 UTC m=+928.938774966" Dec 02 08:01:21 crc kubenswrapper[4691]: E1202 08:01:21.178082 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" podUID="09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b" Dec 02 08:01:21 crc kubenswrapper[4691]: E1202 08:01:21.178183 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" podUID="4a080540-9871-4b8c-9d74-e3f1f3cf317c" Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.373448 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" podStartSLOduration=7.046798628 podStartE2EDuration="46.373425663s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.353964128 +0000 UTC m=+886.138042990" lastFinishedPulling="2025-12-02 08:01:17.680591173 +0000 UTC m=+925.464670025" observedRunningTime="2025-12-02 08:01:21.37013385 +0000 UTC m=+929.154212722" watchObservedRunningTime="2025-12-02 08:01:21.373425663 +0000 UTC m=+929.157504525" Dec 02 08:01:21 crc kubenswrapper[4691]: I1202 08:01:21.468473 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw" podStartSLOduration=16.796992604 podStartE2EDuration="46.468452102s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.242196858 +0000 UTC m=+886.026275720" lastFinishedPulling="2025-12-02 08:01:07.913656356 +0000 UTC m=+915.697735218" observedRunningTime="2025-12-02 08:01:21.466589645 +0000 UTC m=+929.250668507" watchObservedRunningTime="2025-12-02 08:01:21.468452102 +0000 UTC m=+929.252530964" Dec 02 08:01:21 crc kubenswrapper[4691]: E1202 08:01:21.523219 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv" podUID="fd33241e-270e-4dbe-b024-e368b2050ece" Dec 02 08:01:22 crc kubenswrapper[4691]: I1202 08:01:22.154795 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv" event={"ID":"fd33241e-270e-4dbe-b024-e368b2050ece","Type":"ContainerStarted","Data":"92abb2eac3b0fd9eed54c1dfd6d9260a427e502ab939e7a6bcae8c180f7386b3"} Dec 02 08:01:22 crc kubenswrapper[4691]: I1202 08:01:22.179695 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-jrs4t" event={"ID":"ca399b1a-df12-435c-8ebf-6f787ed29a8e","Type":"ContainerStarted","Data":"840593c951252ba9f9ee00e56e405164fde37a5ff4a9987d2fe1a3f50230e853"} Dec 02 08:01:22 crc kubenswrapper[4691]: I1202 08:01:22.193479 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg" event={"ID":"e88d7782-bcf8-4d40-aa1c-269533471279","Type":"ContainerStarted","Data":"0cd2e3740445c75cf4726f498d251898a82e06b2614e8c948f948d4bd26e52b2"} Dec 02 08:01:22 crc kubenswrapper[4691]: E1202 08:01:22.740180 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s" podUID="f2666d2b-c30c-40d4-bfab-0e6d00571ecc" Dec 02 08:01:23 crc kubenswrapper[4691]: I1202 08:01:23.210022 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s" event={"ID":"f2666d2b-c30c-40d4-bfab-0e6d00571ecc","Type":"ContainerStarted","Data":"966fa7f25bf3f60ebe0f83735c1facd47fd577d6804bf79ef1fa6ea60dd6b5c9"} Dec 02 08:01:23 crc kubenswrapper[4691]: E1202 08:01:23.248524 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" podUID="e8014abc-9e31-40d3-8e34-d595a8ef95b4" Dec 02 08:01:24 crc kubenswrapper[4691]: I1202 08:01:24.258821 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r" event={"ID":"c92e2d03-8848-432f-82f4-fd28b3b0fa34","Type":"ContainerStarted","Data":"2f4347c85c5ebe2b501d3e5d78d1878c6652414e321c7da472400eb3ff0efe5c"} Dec 02 08:01:24 crc kubenswrapper[4691]: I1202 08:01:24.261905 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r" Dec 02 08:01:24 crc kubenswrapper[4691]: I1202 08:01:24.266223 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg" event={"ID":"e88d7782-bcf8-4d40-aa1c-269533471279","Type":"ContainerStarted","Data":"7d8c0a26144bbfd82e391c148c5a5abdcf8ee42ab49f2723cfb0a9a1190e07bc"} Dec 02 08:01:24 crc kubenswrapper[4691]: I1202 08:01:24.267175 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg" Dec 02 08:01:24 crc kubenswrapper[4691]: I1202 08:01:24.272130 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" event={"ID":"e8014abc-9e31-40d3-8e34-d595a8ef95b4","Type":"ContainerStarted","Data":"ec5a2512742f4efe32a72aafe8196bde54795297c34683c979d0f7ce7f638d46"} Dec 02 08:01:24 crc kubenswrapper[4691]: E1202 08:01:24.274610 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" 
pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" podUID="e8014abc-9e31-40d3-8e34-d595a8ef95b4" Dec 02 08:01:24 crc kubenswrapper[4691]: I1202 08:01:24.282311 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r" podStartSLOduration=4.464940529 podStartE2EDuration="49.282293393s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.271597217 +0000 UTC m=+886.055676089" lastFinishedPulling="2025-12-02 08:01:23.088950091 +0000 UTC m=+930.873028953" observedRunningTime="2025-12-02 08:01:24.277040781 +0000 UTC m=+932.061119663" watchObservedRunningTime="2025-12-02 08:01:24.282293393 +0000 UTC m=+932.066372255" Dec 02 08:01:24 crc kubenswrapper[4691]: I1202 08:01:24.330882 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg" podStartSLOduration=5.372457374 podStartE2EDuration="49.330860624s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:37.338881088 +0000 UTC m=+885.122959950" lastFinishedPulling="2025-12-02 08:01:21.297284338 +0000 UTC m=+929.081363200" observedRunningTime="2025-12-02 08:01:24.313676352 +0000 UTC m=+932.097755214" watchObservedRunningTime="2025-12-02 08:01:24.330860624 +0000 UTC m=+932.114939486" Dec 02 08:01:25 crc kubenswrapper[4691]: I1202 08:01:25.291925 4691 generic.go:334] "Generic (PLEG): container finished" podID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" containerID="840593c951252ba9f9ee00e56e405164fde37a5ff4a9987d2fe1a3f50230e853" exitCode=0 Dec 02 08:01:25 crc kubenswrapper[4691]: I1202 08:01:25.292109 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrs4t" event={"ID":"ca399b1a-df12-435c-8ebf-6f787ed29a8e","Type":"ContainerDied","Data":"840593c951252ba9f9ee00e56e405164fde37a5ff4a9987d2fe1a3f50230e853"} Dec 02 08:01:25 crc kubenswrapper[4691]: E1202 08:01:25.296230 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" podUID="e8014abc-9e31-40d3-8e34-d595a8ef95b4" Dec 02 08:01:25 crc kubenswrapper[4691]: I1202 08:01:25.817536 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-qltbw" Dec 02 08:01:25 crc kubenswrapper[4691]: I1202 08:01:25.905692 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-p2dmm" Dec 02 08:01:26 crc kubenswrapper[4691]: I1202 08:01:26.090223 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6tdpt" Dec 02 08:01:26 crc kubenswrapper[4691]: I1202 08:01:26.302860 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-66zbg" Dec 02 08:01:26 crc kubenswrapper[4691]: I1202 08:01:26.574825 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk" Dec 02 08:01:26 crc kubenswrapper[4691]: I1202 08:01:26.574871 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-666pb" Dec 02 08:01:28 crc kubenswrapper[4691]: I1202 08:01:28.620024 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-85b7b84db-d9nql" Dec 02 08:01:31 crc kubenswrapper[4691]: E1202 08:01:31.562977 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" podUID="b42f21ed-f361-4b6f-abc7-03b237501f65" Dec 02 08:01:34 crc kubenswrapper[4691]: E1202 08:01:34.563984 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" podUID="09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b" Dec 02 08:01:34 crc kubenswrapper[4691]: E1202 08:01:34.564119 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" podUID="4a080540-9871-4b8c-9d74-e3f1f3cf317c" Dec 02 08:01:36 crc kubenswrapper[4691]: I1202 08:01:36.112840 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-d898r" Dec 02 08:01:39 crc kubenswrapper[4691]: I1202 08:01:39.496354 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv" event={"ID":"fd33241e-270e-4dbe-b024-e368b2050ece","Type":"ContainerStarted","Data":"77b663c63adc91abf36387f01218371f0b058270b8797835d1245ed2dc2b4efc"} Dec 02 08:01:41 crc kubenswrapper[4691]: I1202 08:01:41.877584 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ws8dp"] Dec 02 08:01:41 crc kubenswrapper[4691]: I1202 08:01:41.879009 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:01:41 crc kubenswrapper[4691]: I1202 08:01:41.892579 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ws8dp"] Dec 02 08:01:42 crc kubenswrapper[4691]: I1202 08:01:42.031868 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9nk4\" (UniqueName: \"kubernetes.io/projected/fabe7745-10ed-4802-a03d-6e6c50474bb5-kube-api-access-g9nk4\") pod \"certified-operators-ws8dp\" (UID: \"fabe7745-10ed-4802-a03d-6e6c50474bb5\") " pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:01:42 crc kubenswrapper[4691]: I1202 08:01:42.032266 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabe7745-10ed-4802-a03d-6e6c50474bb5-catalog-content\") pod \"certified-operators-ws8dp\" (UID: \"fabe7745-10ed-4802-a03d-6e6c50474bb5\") " pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:01:42 crc kubenswrapper[4691]: I1202 08:01:42.032304 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabe7745-10ed-4802-a03d-6e6c50474bb5-utilities\") pod \"certified-operators-ws8dp\" (UID: \"fabe7745-10ed-4802-a03d-6e6c50474bb5\") " pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:01:42 crc kubenswrapper[4691]: I1202 08:01:42.133614 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9nk4\" (UniqueName: \"kubernetes.io/projected/fabe7745-10ed-4802-a03d-6e6c50474bb5-kube-api-access-g9nk4\") pod \"certified-operators-ws8dp\" (UID: \"fabe7745-10ed-4802-a03d-6e6c50474bb5\") " pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:01:42 crc kubenswrapper[4691]: I1202 08:01:42.133739 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabe7745-10ed-4802-a03d-6e6c50474bb5-catalog-content\") pod \"certified-operators-ws8dp\" (UID: \"fabe7745-10ed-4802-a03d-6e6c50474bb5\") " pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:01:42 crc kubenswrapper[4691]: I1202 08:01:42.133796 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabe7745-10ed-4802-a03d-6e6c50474bb5-utilities\") pod \"certified-operators-ws8dp\" (UID: \"fabe7745-10ed-4802-a03d-6e6c50474bb5\") " pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:01:42 crc kubenswrapper[4691]: I1202 08:01:42.134402 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabe7745-10ed-4802-a03d-6e6c50474bb5-catalog-content\") pod \"certified-operators-ws8dp\" (UID: \"fabe7745-10ed-4802-a03d-6e6c50474bb5\") " pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:01:42 crc kubenswrapper[4691]: I1202 08:01:42.134429 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabe7745-10ed-4802-a03d-6e6c50474bb5-utilities\") pod \"certified-operators-ws8dp\" (UID: \"fabe7745-10ed-4802-a03d-6e6c50474bb5\") " pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:01:42 crc kubenswrapper[4691]: I1202 08:01:42.161193 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-g9nk4\" (UniqueName: \"kubernetes.io/projected/fabe7745-10ed-4802-a03d-6e6c50474bb5-kube-api-access-g9nk4\") pod \"certified-operators-ws8dp\" (UID: \"fabe7745-10ed-4802-a03d-6e6c50474bb5\") " pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:01:42 crc kubenswrapper[4691]: I1202 08:01:42.199064 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:01:42 crc kubenswrapper[4691]: I1202 08:01:42.866073 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ws8dp"] Dec 02 08:01:43 crc kubenswrapper[4691]: I1202 08:01:43.698451 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws8dp" event={"ID":"fabe7745-10ed-4802-a03d-6e6c50474bb5","Type":"ContainerStarted","Data":"e62bde5836e4e8d3974b03bf724bcb984fd9563351412d5a3cdcb51b26aaf3dc"} Dec 02 08:01:43 crc kubenswrapper[4691]: I1202 08:01:43.700042 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" event={"ID":"e6bec2f4-8aea-472b-a0f9-591b744f9fe4","Type":"ContainerStarted","Data":"347979eb1a0eb033f3679e5a071f91e625706f13ae86b4b397eb24a2f4d194bf"} Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.708000 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" event={"ID":"d82caf6e-e4a7-4474-8cd4-6d3f554ce608","Type":"ContainerStarted","Data":"1273cc396e17ffec23a593dbb191d9d0684b4f31ee76f1de1a979f2e358ec899"} Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.709909 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn" event={"ID":"0667dbf1-e305-4d12-af7b-3d532a834609","Type":"ContainerStarted","Data":"d0abef95b0b7aec1f9a161e00d945e29fb101f5b39c4970674de010bd91a300b"} Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.711224 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc" event={"ID":"759a905a-dc61-4206-862f-cb8b6f85882f","Type":"ContainerStarted","Data":"3a0443c5376bb629f14f3200806b9f475d7ac70c5ea6a92b52c8c069e9f43c3b"} Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.712421 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2" event={"ID":"36d938bc-e3d6-4f21-8327-5f655a4ef54a","Type":"ContainerStarted","Data":"aec22ebd3eeb544bc5b4a4e39e9c9244ddeaeb275f93251eb86db8b103015719"} Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.713403 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv" event={"ID":"5369081c-2142-4dfa-9482-b8d8d6d4195f","Type":"ContainerStarted","Data":"bee490269b1b0b2b83971b1ae9656734a7aab3265bf1e87382ac6b8cba7adc05"} Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.714442 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc" event={"ID":"9e639d67-2200-474e-9be7-55bef7c97fe6","Type":"ContainerStarted","Data":"aad0d4ab68ccb04881298cdba47bc19cd733a4b589acd3c5a20422d0b2afa142"} Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.716538 4691 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-jrs4t" event={"ID":"ca399b1a-df12-435c-8ebf-6f787ed29a8e","Type":"ContainerStarted","Data":"f77ca087ca4e605ba12a52693f87610db01e957d14c898e67d78920da411e86d"} Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.717538 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" event={"ID":"b930ff47-307d-47b3-9b84-54e5860ee2db","Type":"ContainerStarted","Data":"ab480850233e5eb3ba395d1ab136b16c532cea7ae289e408583dee4dda23b129"} Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.719211 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s" event={"ID":"f2666d2b-c30c-40d4-bfab-0e6d00571ecc","Type":"ContainerStarted","Data":"0b2b4e0ce5980da84c99d479c0bd44d69e046a2ab3a6ddab5802fdbdd6270d28"} Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.721094 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn" event={"ID":"dfa342c0-60d5-4025-a881-30d706833e2b","Type":"ContainerStarted","Data":"2c0245bcf00193bdfd463a0c1b4160c5f984a8d87c49ca50717f07c9ef6b07b1"} Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.721463 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv" Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.723973 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv" Dec 02 08:01:44 crc kubenswrapper[4691]: I1202 08:01:44.745896 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-nlwnv" podStartSLOduration=18.244569347 podStartE2EDuration="1m9.74587993s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.271230708 +0000 UTC m=+886.055309570" lastFinishedPulling="2025-12-02 08:01:29.772541291 +0000 UTC m=+937.556620153" observedRunningTime="2025-12-02 08:01:44.744444963 +0000 UTC m=+952.528523825" watchObservedRunningTime="2025-12-02 08:01:44.74587993 +0000 UTC m=+952.529958792" Dec 02 08:01:46 crc kubenswrapper[4691]: I1202 08:01:46.735014 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws8dp" event={"ID":"fabe7745-10ed-4802-a03d-6e6c50474bb5","Type":"ContainerStarted","Data":"b896023423feb764883b63bb2ae7a655a2d65b21d48887cf187f78f98c926883"} Dec 02 08:01:47 crc kubenswrapper[4691]: I1202 08:01:47.743949 4691 generic.go:334] "Generic (PLEG): container finished" podID="fabe7745-10ed-4802-a03d-6e6c50474bb5" containerID="b896023423feb764883b63bb2ae7a655a2d65b21d48887cf187f78f98c926883" exitCode=0 Dec 02 08:01:47 crc kubenswrapper[4691]: I1202 08:01:47.744023 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws8dp" event={"ID":"fabe7745-10ed-4802-a03d-6e6c50474bb5","Type":"ContainerDied","Data":"b896023423feb764883b63bb2ae7a655a2d65b21d48887cf187f78f98c926883"} Dec 02 08:01:47 crc kubenswrapper[4691]: I1202 08:01:47.745034 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2" Dec 02 08:01:47 crc kubenswrapper[4691]: E1202 08:01:47.767027 4691 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv" podUID="5369081c-2142-4dfa-9482-b8d8d6d4195f" Dec 02 08:01:47 crc kubenswrapper[4691]: E1202 08:01:47.771080 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc" podUID="9e639d67-2200-474e-9be7-55bef7c97fe6" Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.756381 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn" event={"ID":"dfa342c0-60d5-4025-a881-30d706833e2b","Type":"ContainerStarted","Data":"e4b1dfac5edba834d1db21124a83fe4e46463bfd993ad8cbd220398e660129af"} Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.758574 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn" Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.760129 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" event={"ID":"e8014abc-9e31-40d3-8e34-d595a8ef95b4","Type":"ContainerStarted","Data":"03c6f089f07a4642d5468d502761219402f79e7eb43707ba5330da38bdca4527"} Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.760439 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.765046 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" event={"ID":"e6bec2f4-8aea-472b-a0f9-591b744f9fe4","Type":"ContainerStarted","Data":"b5ded58da6fd660a190fe26e1d30d879ddbbd41e9587d5ef86dba1b42e3ee58d"} Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.765248 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.770973 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" event={"ID":"b930ff47-307d-47b3-9b84-54e5860ee2db","Type":"ContainerStarted","Data":"99916d1e544d4d05ddf487a7c38286a8117855489b63acbf8023239c7655332e"} Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.771683 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc" Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.772152 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.784281 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.786581 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2" podStartSLOduration=20.812786452 podStartE2EDuration="1m13.786560674s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:37.798062762 +0000 UTC m=+885.582141624" lastFinishedPulling="2025-12-02 08:01:30.771836974 +0000 UTC m=+938.555915846" observedRunningTime="2025-12-02 08:01:47.767032882 +0000 UTC m=+955.551111764" watchObservedRunningTime="2025-12-02 08:01:48.786560674 +0000 UTC m=+956.570639536" Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.831133 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" podStartSLOduration=4.152540535 podStartE2EDuration="1m13.831106794s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.2498411 +0000 UTC m=+886.033919962" lastFinishedPulling="2025-12-02 08:01:47.928407349 +0000 UTC m=+955.712486221" observedRunningTime="2025-12-02 08:01:48.827565405 +0000 UTC m=+956.611644277" watchObservedRunningTime="2025-12-02 08:01:48.831106794 +0000 UTC m=+956.615185656" Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.837014 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn" podStartSLOduration=21.371847277 podStartE2EDuration="1m13.836985502s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.359162828 +0000 UTC m=+886.143241690" lastFinishedPulling="2025-12-02 08:01:30.824301053 +0000 UTC m=+938.608379915" observedRunningTime="2025-12-02 08:01:48.796256208 +0000 UTC m=+956.580335090" watchObservedRunningTime="2025-12-02 08:01:48.836985502 +0000 UTC m=+956.621064374" Dec 02 08:01:48 crc kubenswrapper[4691]: I1202 08:01:48.925642 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" podStartSLOduration=22.359786145 podStartE2EDuration="1m13.92562673s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.196089809 +0000 UTC m=+885.980168661" lastFinishedPulling="2025-12-02 08:01:29.761930394 +0000 UTC m=+937.546009246" observedRunningTime="2025-12-02 08:01:48.924379079 +0000 UTC m=+956.708457941" watchObservedRunningTime="2025-12-02 08:01:48.92562673 +0000 UTC m=+956.709705592" Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.006222 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn" podStartSLOduration=21.383538741 podStartE2EDuration="1m14.006203996s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.193001331 +0000 UTC m=+885.977080203" lastFinishedPulling="2025-12-02 08:01:30.815666596 +0000 UTC m=+938.599745458" observedRunningTime="2025-12-02 08:01:49.001322093 +0000 UTC m=+956.785400965" watchObservedRunningTime="2025-12-02 08:01:49.006203996 +0000 UTC m=+956.790282848" Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.007931 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jrs4t" podStartSLOduration=36.543239181 podStartE2EDuration="47.007914229s" podCreationTimestamp="2025-12-02 08:01:02 +0000 UTC" firstStartedPulling="2025-12-02 08:01:20.350646129 +0000 UTC m=+928.134724991" 
lastFinishedPulling="2025-12-02 08:01:30.815321177 +0000 UTC m=+938.599400039" observedRunningTime="2025-12-02 08:01:48.966402525 +0000 UTC m=+956.750481387" watchObservedRunningTime="2025-12-02 08:01:49.007914229 +0000 UTC m=+956.791993091" Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.029334 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc" podStartSLOduration=22.460396544 podStartE2EDuration="1m14.029309517s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.193012281 +0000 UTC m=+885.977091143" lastFinishedPulling="2025-12-02 08:01:29.761925244 +0000 UTC m=+937.546004116" observedRunningTime="2025-12-02 08:01:49.027554133 +0000 UTC m=+956.811633005" watchObservedRunningTime="2025-12-02 08:01:49.029309517 +0000 UTC m=+956.813388379" Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.049998 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" podStartSLOduration=62.283406517 podStartE2EDuration="1m14.049971976s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:01:19.049153288 +0000 UTC m=+926.833232150" lastFinishedPulling="2025-12-02 08:01:30.815718747 +0000 UTC m=+938.599797609" observedRunningTime="2025-12-02 08:01:49.046685624 +0000 UTC m=+956.830764496" watchObservedRunningTime="2025-12-02 08:01:49.049971976 +0000 UTC m=+956.834050838" Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.121253 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v" podStartSLOduration=62.398406889 podStartE2EDuration="1m14.121238578s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:01:19.04882839 +0000 UTC m=+926.832907252" lastFinishedPulling="2025-12-02 08:01:30.771660079 +0000 UTC m=+938.555738941" observedRunningTime="2025-12-02 08:01:49.11813808 +0000 UTC m=+956.902216942" watchObservedRunningTime="2025-12-02 08:01:49.121238578 +0000 UTC m=+956.905317430" Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.183381 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s" podStartSLOduration=21.143284561 podStartE2EDuration="1m14.18336375s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:37.775623748 +0000 UTC m=+885.559702610" lastFinishedPulling="2025-12-02 08:01:30.815702937 +0000 UTC m=+938.599781799" observedRunningTime="2025-12-02 08:01:49.181036652 +0000 UTC m=+956.965115514" watchObservedRunningTime="2025-12-02 08:01:49.18336375 +0000 UTC m=+956.967442612" Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.780223 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" event={"ID":"b42f21ed-f361-4b6f-abc7-03b237501f65","Type":"ContainerStarted","Data":"3337e1693f071428dc3365e81afa892675bab5417465883ec733d60b550c05b1"} Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.783794 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" 
event={"ID":"4a080540-9871-4b8c-9d74-e3f1f3cf317c","Type":"ContainerStarted","Data":"dacdd73cacf83b59465cea030ddd1f7150183cb34c65f44c22dfe9c19a448e16"} Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.784715 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.806157 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-2kbsh" podStartSLOduration=3.4138022230000002 podStartE2EDuration="1m13.806135157s" podCreationTimestamp="2025-12-02 08:00:36 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.353966798 +0000 UTC m=+886.138045660" lastFinishedPulling="2025-12-02 08:01:48.746299732 +0000 UTC m=+956.530378594" observedRunningTime="2025-12-02 08:01:49.799531481 +0000 UTC m=+957.583610343" watchObservedRunningTime="2025-12-02 08:01:49.806135157 +0000 UTC m=+957.590214019" Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.811050 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-44twc" Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.830268 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" podStartSLOduration=4.261610637 podStartE2EDuration="1m14.830251263s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.358939273 +0000 UTC m=+886.143018135" lastFinishedPulling="2025-12-02 08:01:48.927579899 +0000 UTC m=+956.711658761" observedRunningTime="2025-12-02 08:01:49.829184077 +0000 UTC m=+957.613262939" watchObservedRunningTime="2025-12-02 08:01:49.830251263 +0000 UTC m=+957.614330125" Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.835885 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-slsfn" Dec 02 08:01:49 crc kubenswrapper[4691]: I1202 08:01:49.855537 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-n5t7p" Dec 02 08:01:50 crc kubenswrapper[4691]: I1202 08:01:50.800236 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws8dp" event={"ID":"fabe7745-10ed-4802-a03d-6e6c50474bb5","Type":"ContainerStarted","Data":"d534edafa7eb3c548b0547f20990b7978173b97f1f9a9596fd20934478de3085"} Dec 02 08:01:50 crc kubenswrapper[4691]: I1202 08:01:50.802334 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv" event={"ID":"5369081c-2142-4dfa-9482-b8d8d6d4195f","Type":"ContainerStarted","Data":"ff1636c8bedbfd8adbb3536871b21f09d0ad502a9180ca96336cce9da3c893e3"} Dec 02 08:01:50 crc kubenswrapper[4691]: I1202 08:01:50.832975 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" podUID="d82caf6e-e4a7-4474-8cd4-6d3f554ce608" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.81:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:01:50 crc kubenswrapper[4691]: I1202 08:01:50.866475 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv" podStartSLOduration=3.457424108 podStartE2EDuration="1m15.866455244s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:37.247314845 +0000 UTC m=+885.031393707" lastFinishedPulling="2025-12-02 08:01:49.656345981 +0000 UTC m=+957.440424843" observedRunningTime="2025-12-02 08:01:50.861648813 +0000 UTC m=+958.645727675" watchObservedRunningTime="2025-12-02 08:01:50.866455244 +0000 UTC m=+958.650534106" Dec 02 08:01:52 crc kubenswrapper[4691]: I1202 08:01:52.819629 4691 generic.go:334] "Generic (PLEG): container finished" podID="fabe7745-10ed-4802-a03d-6e6c50474bb5" containerID="d534edafa7eb3c548b0547f20990b7978173b97f1f9a9596fd20934478de3085" exitCode=0 Dec 02 08:01:52 crc kubenswrapper[4691]: I1202 08:01:52.819692 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws8dp" event={"ID":"fabe7745-10ed-4802-a03d-6e6c50474bb5","Type":"ContainerDied","Data":"d534edafa7eb3c548b0547f20990b7978173b97f1f9a9596fd20934478de3085"} Dec 02 08:01:53 crc kubenswrapper[4691]: I1202 08:01:53.122864 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:53 crc kubenswrapper[4691]: I1202 08:01:53.122913 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:01:54 crc kubenswrapper[4691]: I1202 08:01:54.283438 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jrs4t" podUID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" containerName="registry-server" probeResult="failure" output=< Dec 02 08:01:54 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Dec 02 08:01:54 crc kubenswrapper[4691]: > Dec 02 08:01:54 crc kubenswrapper[4691]: I1202 08:01:54.837921 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws8dp" event={"ID":"fabe7745-10ed-4802-a03d-6e6c50474bb5","Type":"ContainerStarted","Data":"80c530a378c72621deef0d7e1b0d73ce57714b0783d80e1a4642a4d3c95720b5"} Dec 02 08:01:54 crc kubenswrapper[4691]: I1202 08:01:54.840354 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc" event={"ID":"9e639d67-2200-474e-9be7-55bef7c97fe6","Type":"ContainerStarted","Data":"5f8af2d7644abc83d703fd5b3c4a29c8a17dcee4620dd7ea977479b96d2ff5ec"} Dec 02 08:01:54 crc kubenswrapper[4691]: I1202 08:01:54.841167 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc" Dec 02 08:01:54 crc kubenswrapper[4691]: I1202 08:01:54.843720 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" event={"ID":"09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b","Type":"ContainerStarted","Data":"a1b48f5ab2aa46cac9f353dfa8886ed0d93045de8dd2750f41d7e6ebfc4bd1bb"} Dec 02 08:01:54 crc kubenswrapper[4691]: I1202 08:01:54.844421 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" Dec 02 08:01:54 crc kubenswrapper[4691]: I1202 08:01:54.858871 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ws8dp" podStartSLOduration=8.385903361 
podStartE2EDuration="13.858856875s" podCreationTimestamp="2025-12-02 08:01:41 +0000 UTC" firstStartedPulling="2025-12-02 08:01:48.774745357 +0000 UTC m=+956.558824219" lastFinishedPulling="2025-12-02 08:01:54.247698871 +0000 UTC m=+962.031777733" observedRunningTime="2025-12-02 08:01:54.856800993 +0000 UTC m=+962.640879855" watchObservedRunningTime="2025-12-02 08:01:54.858856875 +0000 UTC m=+962.642935737" Dec 02 08:01:54 crc kubenswrapper[4691]: I1202 08:01:54.881300 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" podStartSLOduration=5.218443411 podStartE2EDuration="1m19.881281139s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:38.281441064 +0000 UTC m=+886.065519926" lastFinishedPulling="2025-12-02 08:01:52.944278782 +0000 UTC m=+960.728357654" observedRunningTime="2025-12-02 08:01:54.877381691 +0000 UTC m=+962.661460553" watchObservedRunningTime="2025-12-02 08:01:54.881281139 +0000 UTC m=+962.665360001" Dec 02 08:01:54 crc kubenswrapper[4691]: I1202 08:01:54.907455 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc" podStartSLOduration=3.814421693 podStartE2EDuration="1m19.907434656s" podCreationTimestamp="2025-12-02 08:00:35 +0000 UTC" firstStartedPulling="2025-12-02 08:00:37.792335538 +0000 UTC m=+885.576414400" lastFinishedPulling="2025-12-02 08:01:53.885348501 +0000 UTC m=+961.669427363" observedRunningTime="2025-12-02 08:01:54.90401782 +0000 UTC m=+962.688096672" watchObservedRunningTime="2025-12-02 08:01:54.907434656 +0000 UTC m=+962.691513518" Dec 02 08:01:55 crc kubenswrapper[4691]: I1202 08:01:55.607625 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv" Dec 02 08:01:55 crc kubenswrapper[4691]: I1202 08:01:55.609984 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-w9hbv" Dec 02 08:01:55 crc kubenswrapper[4691]: I1202 08:01:55.727392 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s" Dec 02 08:01:55 crc kubenswrapper[4691]: I1202 08:01:55.729875 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-n5f6s" Dec 02 08:01:56 crc kubenswrapper[4691]: I1202 08:01:56.043093 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-28bs2" Dec 02 08:01:56 crc kubenswrapper[4691]: I1202 08:01:56.050475 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn" Dec 02 08:01:56 crc kubenswrapper[4691]: I1202 08:01:56.052830 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-lhkdn" Dec 02 08:01:56 crc kubenswrapper[4691]: I1202 08:01:56.107091 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-2g56f" Dec 02 08:01:56 crc kubenswrapper[4691]: I1202 08:01:56.121515 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-9p97b" Dec 02 08:01:56 crc kubenswrapper[4691]: I1202 08:01:56.500551 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" Dec 02 08:01:56 crc kubenswrapper[4691]: I1202 08:01:56.504177 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-wnzqd" Dec 02 08:02:02 crc kubenswrapper[4691]: I1202 08:02:02.199158 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:02:02 crc kubenswrapper[4691]: I1202 08:02:02.199785 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:02:02 crc kubenswrapper[4691]: I1202 08:02:02.243901 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:02:03 crc kubenswrapper[4691]: I1202 08:02:03.042300 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:02:03 crc kubenswrapper[4691]: I1202 08:02:03.080634 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ws8dp"] Dec 02 08:02:03 crc kubenswrapper[4691]: I1202 08:02:03.172776 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:02:03 crc kubenswrapper[4691]: I1202 08:02:03.215227 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:02:04 crc kubenswrapper[4691]: I1202 08:02:04.875702 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jrs4t"] Dec 02 08:02:05 crc kubenswrapper[4691]: I1202 08:02:05.014512 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jrs4t" podUID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" containerName="registry-server" containerID="cri-o://f77ca087ca4e605ba12a52693f87610db01e957d14c898e67d78920da411e86d" gracePeriod=2 Dec 02 08:02:05 crc kubenswrapper[4691]: I1202 08:02:05.014597 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ws8dp" podUID="fabe7745-10ed-4802-a03d-6e6c50474bb5" containerName="registry-server" containerID="cri-o://80c530a378c72621deef0d7e1b0d73ce57714b0783d80e1a4642a4d3c95720b5" gracePeriod=2 Dec 02 08:02:05 crc kubenswrapper[4691]: I1202 08:02:05.658242 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-c7ftc" Dec 02 08:02:06 crc kubenswrapper[4691]: I1202 08:02:06.132688 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-gbqw2" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.031484 4691 generic.go:334] "Generic (PLEG): container finished" podID="fabe7745-10ed-4802-a03d-6e6c50474bb5" containerID="80c530a378c72621deef0d7e1b0d73ce57714b0783d80e1a4642a4d3c95720b5" exitCode=0 Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.031524 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-ws8dp" event={"ID":"fabe7745-10ed-4802-a03d-6e6c50474bb5","Type":"ContainerDied","Data":"80c530a378c72621deef0d7e1b0d73ce57714b0783d80e1a4642a4d3c95720b5"} Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.033434 4691 generic.go:334] "Generic (PLEG): container finished" podID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" containerID="f77ca087ca4e605ba12a52693f87610db01e957d14c898e67d78920da411e86d" exitCode=0 Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.033461 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrs4t" event={"ID":"ca399b1a-df12-435c-8ebf-6f787ed29a8e","Type":"ContainerDied","Data":"f77ca087ca4e605ba12a52693f87610db01e957d14c898e67d78920da411e86d"} Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.287872 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.293447 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9nk4\" (UniqueName: \"kubernetes.io/projected/fabe7745-10ed-4802-a03d-6e6c50474bb5-kube-api-access-g9nk4\") pod \"fabe7745-10ed-4802-a03d-6e6c50474bb5\" (UID: \"fabe7745-10ed-4802-a03d-6e6c50474bb5\") " Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.293529 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabe7745-10ed-4802-a03d-6e6c50474bb5-utilities\") pod \"fabe7745-10ed-4802-a03d-6e6c50474bb5\" (UID: \"fabe7745-10ed-4802-a03d-6e6c50474bb5\") " Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.293704 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabe7745-10ed-4802-a03d-6e6c50474bb5-catalog-content\") pod \"fabe7745-10ed-4802-a03d-6e6c50474bb5\" (UID: \"fabe7745-10ed-4802-a03d-6e6c50474bb5\") " Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.294846 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fabe7745-10ed-4802-a03d-6e6c50474bb5-utilities" (OuterVolumeSpecName: "utilities") pod "fabe7745-10ed-4802-a03d-6e6c50474bb5" (UID: "fabe7745-10ed-4802-a03d-6e6c50474bb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.302384 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fabe7745-10ed-4802-a03d-6e6c50474bb5-kube-api-access-g9nk4" (OuterVolumeSpecName: "kube-api-access-g9nk4") pod "fabe7745-10ed-4802-a03d-6e6c50474bb5" (UID: "fabe7745-10ed-4802-a03d-6e6c50474bb5"). InnerVolumeSpecName "kube-api-access-g9nk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.346897 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fabe7745-10ed-4802-a03d-6e6c50474bb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fabe7745-10ed-4802-a03d-6e6c50474bb5" (UID: "fabe7745-10ed-4802-a03d-6e6c50474bb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.373534 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.398110 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsq7v\" (UniqueName: \"kubernetes.io/projected/ca399b1a-df12-435c-8ebf-6f787ed29a8e-kube-api-access-rsq7v\") pod \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\" (UID: \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\") " Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.398262 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca399b1a-df12-435c-8ebf-6f787ed29a8e-catalog-content\") pod \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\" (UID: \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\") " Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.398403 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca399b1a-df12-435c-8ebf-6f787ed29a8e-utilities\") pod \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\" (UID: \"ca399b1a-df12-435c-8ebf-6f787ed29a8e\") " Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.398914 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fabe7745-10ed-4802-a03d-6e6c50474bb5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.398947 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9nk4\" (UniqueName: \"kubernetes.io/projected/fabe7745-10ed-4802-a03d-6e6c50474bb5-kube-api-access-g9nk4\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.398961 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fabe7745-10ed-4802-a03d-6e6c50474bb5-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.399655 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca399b1a-df12-435c-8ebf-6f787ed29a8e-utilities" (OuterVolumeSpecName: "utilities") pod "ca399b1a-df12-435c-8ebf-6f787ed29a8e" (UID: "ca399b1a-df12-435c-8ebf-6f787ed29a8e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.400737 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca399b1a-df12-435c-8ebf-6f787ed29a8e-kube-api-access-rsq7v" (OuterVolumeSpecName: "kube-api-access-rsq7v") pod "ca399b1a-df12-435c-8ebf-6f787ed29a8e" (UID: "ca399b1a-df12-435c-8ebf-6f787ed29a8e"). InnerVolumeSpecName "kube-api-access-rsq7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.497724 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca399b1a-df12-435c-8ebf-6f787ed29a8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca399b1a-df12-435c-8ebf-6f787ed29a8e" (UID: "ca399b1a-df12-435c-8ebf-6f787ed29a8e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.499779 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca399b1a-df12-435c-8ebf-6f787ed29a8e-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.499801 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsq7v\" (UniqueName: \"kubernetes.io/projected/ca399b1a-df12-435c-8ebf-6f787ed29a8e-kube-api-access-rsq7v\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:07 crc kubenswrapper[4691]: I1202 08:02:07.499812 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca399b1a-df12-435c-8ebf-6f787ed29a8e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.043613 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jrs4t" Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.043608 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jrs4t" event={"ID":"ca399b1a-df12-435c-8ebf-6f787ed29a8e","Type":"ContainerDied","Data":"2b9632d6c61716f54cf04ffcbec6588f4c166c5ce973a8991ac752aa3b9718af"} Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.043797 4691 scope.go:117] "RemoveContainer" containerID="f77ca087ca4e605ba12a52693f87610db01e957d14c898e67d78920da411e86d" Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.047304 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ws8dp" event={"ID":"fabe7745-10ed-4802-a03d-6e6c50474bb5","Type":"ContainerDied","Data":"e62bde5836e4e8d3974b03bf724bcb984fd9563351412d5a3cdcb51b26aaf3dc"} Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.047357 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ws8dp" Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.062432 4691 scope.go:117] "RemoveContainer" containerID="840593c951252ba9f9ee00e56e405164fde37a5ff4a9987d2fe1a3f50230e853" Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.080943 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jrs4t"] Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.091934 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jrs4t"] Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.098895 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ws8dp"] Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.105226 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ws8dp"] Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.105632 4691 scope.go:117] "RemoveContainer" containerID="619aea510c1d1a8bb66172719818ef85cbb96788b06f0df3e5f877d2b13335e2" Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.121774 4691 scope.go:117] "RemoveContainer" containerID="80c530a378c72621deef0d7e1b0d73ce57714b0783d80e1a4642a4d3c95720b5" Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.137649 4691 scope.go:117] "RemoveContainer" containerID="d534edafa7eb3c548b0547f20990b7978173b97f1f9a9596fd20934478de3085" Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.154049 4691 scope.go:117] "RemoveContainer" containerID="b896023423feb764883b63bb2ae7a655a2d65b21d48887cf187f78f98c926883" Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.572083 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" path="/var/lib/kubelet/pods/ca399b1a-df12-435c-8ebf-6f787ed29a8e/volumes" Dec 02 08:02:08 crc kubenswrapper[4691]: I1202 08:02:08.572921 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fabe7745-10ed-4802-a03d-6e6c50474bb5" path="/var/lib/kubelet/pods/fabe7745-10ed-4802-a03d-6e6c50474bb5/volumes" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.376414 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fl4ml"] Dec 02 08:02:19 crc kubenswrapper[4691]: E1202 08:02:19.379002 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabe7745-10ed-4802-a03d-6e6c50474bb5" containerName="extract-content" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.379027 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabe7745-10ed-4802-a03d-6e6c50474bb5" containerName="extract-content" Dec 02 08:02:19 crc kubenswrapper[4691]: E1202 08:02:19.379066 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" containerName="registry-server" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.379077 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" containerName="registry-server" Dec 02 08:02:19 crc kubenswrapper[4691]: E1202 08:02:19.379129 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabe7745-10ed-4802-a03d-6e6c50474bb5" containerName="registry-server" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.379137 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabe7745-10ed-4802-a03d-6e6c50474bb5" containerName="registry-server" Dec 02 08:02:19 crc kubenswrapper[4691]: 
E1202 08:02:19.379165 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" containerName="extract-content" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.379172 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" containerName="extract-content" Dec 02 08:02:19 crc kubenswrapper[4691]: E1202 08:02:19.379191 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fabe7745-10ed-4802-a03d-6e6c50474bb5" containerName="extract-utilities" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.379202 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fabe7745-10ed-4802-a03d-6e6c50474bb5" containerName="extract-utilities" Dec 02 08:02:19 crc kubenswrapper[4691]: E1202 08:02:19.379233 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" containerName="extract-utilities" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.379240 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" containerName="extract-utilities" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.380726 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca399b1a-df12-435c-8ebf-6f787ed29a8e" containerName="registry-server" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.380778 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fabe7745-10ed-4802-a03d-6e6c50474bb5" containerName="registry-server" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.387058 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.396386 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.396642 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.397481 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.397652 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-zm4h5" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.407977 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fl4ml"] Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.443919 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-x7nv2"] Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.445268 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.448806 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.466068 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-x7nv2"] Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.471163 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0856da41-022a-43d2-acb6-6817e256dea2-config\") pod \"dnsmasq-dns-78dd6ddcc-x7nv2\" (UID: \"0856da41-022a-43d2-acb6-6817e256dea2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.471220 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd-config\") pod \"dnsmasq-dns-675f4bcbfc-fl4ml\" (UID: \"c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.471439 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzss9\" (UniqueName: \"kubernetes.io/projected/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd-kube-api-access-kzss9\") pod \"dnsmasq-dns-675f4bcbfc-fl4ml\" (UID: \"c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.471525 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0856da41-022a-43d2-acb6-6817e256dea2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-x7nv2\" (UID: \"0856da41-022a-43d2-acb6-6817e256dea2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.471554 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgvsf\" (UniqueName: \"kubernetes.io/projected/0856da41-022a-43d2-acb6-6817e256dea2-kube-api-access-lgvsf\") pod \"dnsmasq-dns-78dd6ddcc-x7nv2\" (UID: \"0856da41-022a-43d2-acb6-6817e256dea2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.573257 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0856da41-022a-43d2-acb6-6817e256dea2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-x7nv2\" (UID: \"0856da41-022a-43d2-acb6-6817e256dea2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.573575 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgvsf\" (UniqueName: \"kubernetes.io/projected/0856da41-022a-43d2-acb6-6817e256dea2-kube-api-access-lgvsf\") pod \"dnsmasq-dns-78dd6ddcc-x7nv2\" (UID: \"0856da41-022a-43d2-acb6-6817e256dea2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.573802 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0856da41-022a-43d2-acb6-6817e256dea2-config\") pod \"dnsmasq-dns-78dd6ddcc-x7nv2\" (UID: \"0856da41-022a-43d2-acb6-6817e256dea2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 
08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.573915 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd-config\") pod \"dnsmasq-dns-675f4bcbfc-fl4ml\" (UID: \"c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.574075 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzss9\" (UniqueName: \"kubernetes.io/projected/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd-kube-api-access-kzss9\") pod \"dnsmasq-dns-675f4bcbfc-fl4ml\" (UID: \"c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.574202 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0856da41-022a-43d2-acb6-6817e256dea2-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-x7nv2\" (UID: \"0856da41-022a-43d2-acb6-6817e256dea2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.574751 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0856da41-022a-43d2-acb6-6817e256dea2-config\") pod \"dnsmasq-dns-78dd6ddcc-x7nv2\" (UID: \"0856da41-022a-43d2-acb6-6817e256dea2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.575056 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd-config\") pod \"dnsmasq-dns-675f4bcbfc-fl4ml\" (UID: \"c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.599085 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzss9\" (UniqueName: \"kubernetes.io/projected/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd-kube-api-access-kzss9\") pod \"dnsmasq-dns-675f4bcbfc-fl4ml\" (UID: \"c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.599914 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgvsf\" (UniqueName: \"kubernetes.io/projected/0856da41-022a-43d2-acb6-6817e256dea2-kube-api-access-lgvsf\") pod \"dnsmasq-dns-78dd6ddcc-x7nv2\" (UID: \"0856da41-022a-43d2-acb6-6817e256dea2\") " pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.741055 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" Dec 02 08:02:19 crc kubenswrapper[4691]: I1202 08:02:19.774658 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 08:02:20 crc kubenswrapper[4691]: I1202 08:02:20.222842 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fl4ml"] Dec 02 08:02:20 crc kubenswrapper[4691]: I1202 08:02:20.284005 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-x7nv2"] Dec 02 08:02:20 crc kubenswrapper[4691]: W1202 08:02:20.285294 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0856da41_022a_43d2_acb6_6817e256dea2.slice/crio-a275491e70c28f0888d5878bf6f35db324d59a4ebceb4866dabdc3d7f072b5f6 WatchSource:0}: Error finding container a275491e70c28f0888d5878bf6f35db324d59a4ebceb4866dabdc3d7f072b5f6: Status 404 returned error can't find the container with id a275491e70c28f0888d5878bf6f35db324d59a4ebceb4866dabdc3d7f072b5f6 Dec 02 08:02:21 crc kubenswrapper[4691]: I1202 08:02:21.141100 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" event={"ID":"0856da41-022a-43d2-acb6-6817e256dea2","Type":"ContainerStarted","Data":"a275491e70c28f0888d5878bf6f35db324d59a4ebceb4866dabdc3d7f072b5f6"} Dec 02 08:02:21 crc kubenswrapper[4691]: I1202 08:02:21.142668 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" event={"ID":"c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd","Type":"ContainerStarted","Data":"00983e1048e5035be4331971d178bf572fe7e3f3baa933e604e316ecfd653eec"} Dec 02 08:02:21 crc kubenswrapper[4691]: I1202 08:02:21.899245 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:02:21 crc kubenswrapper[4691]: I1202 08:02:21.899332 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.336225 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fl4ml"] Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.356324 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-x75lc"] Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.358194 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.372975 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-x75lc"] Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.426076 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4zb4\" (UniqueName: \"kubernetes.io/projected/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-kube-api-access-l4zb4\") pod \"dnsmasq-dns-666b6646f7-x75lc\" (UID: \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\") " pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.426164 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-config\") pod \"dnsmasq-dns-666b6646f7-x75lc\" (UID: \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\") " pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.426184 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-dns-svc\") pod \"dnsmasq-dns-666b6646f7-x75lc\" (UID: \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\") " pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.527842 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4zb4\" (UniqueName: \"kubernetes.io/projected/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-kube-api-access-l4zb4\") pod \"dnsmasq-dns-666b6646f7-x75lc\" (UID: \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\") " pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.527958 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-config\") pod \"dnsmasq-dns-666b6646f7-x75lc\" (UID: \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\") " pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.527982 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-dns-svc\") pod \"dnsmasq-dns-666b6646f7-x75lc\" (UID: \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\") " pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.529389 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-config\") pod \"dnsmasq-dns-666b6646f7-x75lc\" (UID: \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\") " pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.529419 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-dns-svc\") pod \"dnsmasq-dns-666b6646f7-x75lc\" (UID: \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\") " pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.564882 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4zb4\" (UniqueName: 
\"kubernetes.io/projected/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-kube-api-access-l4zb4\") pod \"dnsmasq-dns-666b6646f7-x75lc\" (UID: \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\") " pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.631416 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-x7nv2"] Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.696494 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.712087 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2v7b8"] Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.716739 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.724412 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2v7b8"] Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.741721 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2v7b8\" (UID: \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.741828 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl9qg\" (UniqueName: \"kubernetes.io/projected/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-kube-api-access-rl9qg\") pod \"dnsmasq-dns-57d769cc4f-2v7b8\" (UID: \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.741879 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-config\") pod \"dnsmasq-dns-57d769cc4f-2v7b8\" (UID: \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.843662 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl9qg\" (UniqueName: \"kubernetes.io/projected/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-kube-api-access-rl9qg\") pod \"dnsmasq-dns-57d769cc4f-2v7b8\" (UID: \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.843730 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-config\") pod \"dnsmasq-dns-57d769cc4f-2v7b8\" (UID: \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.843798 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2v7b8\" (UID: \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.844664 4691 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2v7b8\" (UID: \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.844698 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-config\") pod \"dnsmasq-dns-57d769cc4f-2v7b8\" (UID: \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:22 crc kubenswrapper[4691]: I1202 08:02:22.867698 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl9qg\" (UniqueName: \"kubernetes.io/projected/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-kube-api-access-rl9qg\") pod \"dnsmasq-dns-57d769cc4f-2v7b8\" (UID: \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.110428 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.212561 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-x75lc"] Dec 02 08:02:23 crc kubenswrapper[4691]: W1202 08:02:23.224209 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa5bdbdd_dfe9_4d04_a8d1_2288e58d4f04.slice/crio-a0b30c9ef3b71fbd78758ca4f2c3c70d701cc6bf937126d042ae5d0fef296abc WatchSource:0}: Error finding container a0b30c9ef3b71fbd78758ca4f2c3c70d701cc6bf937126d042ae5d0fef296abc: Status 404 returned error can't find the container with id a0b30c9ef3b71fbd78758ca4f2c3c70d701cc6bf937126d042ae5d0fef296abc Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.478374 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.479717 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.485980 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.486048 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g4k6l" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.486302 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.486553 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.486678 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.486786 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.489029 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.494401 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.554084 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.554288 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.554361 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ed4ad29-5963-47aa-ba01-faf16686c61d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.554403 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ed4ad29-5963-47aa-ba01-faf16686c61d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.554439 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.554509 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.554592 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrs88\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-kube-api-access-vrs88\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.554627 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-config-data\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.554742 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.554814 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.554856 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.611243 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2v7b8"] Dec 02 08:02:23 crc kubenswrapper[4691]: W1202 08:02:23.627410 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod257ff17d_9e9e_4bda_a93b_8f419bd07ef3.slice/crio-bfd2c940218225a605b84fc3f9722e818d0938a682804257fadbd41a2df5280c WatchSource:0}: Error finding container bfd2c940218225a605b84fc3f9722e818d0938a682804257fadbd41a2df5280c: Status 404 returned error can't find the container with id bfd2c940218225a605b84fc3f9722e818d0938a682804257fadbd41a2df5280c Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.656258 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.657522 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: 
I1202 08:02:23.657715 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.658111 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.658060 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.658386 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.658436 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.658474 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ed4ad29-5963-47aa-ba01-faf16686c61d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.658536 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ed4ad29-5963-47aa-ba01-faf16686c61d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.658580 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.658600 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.658636 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrs88\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-kube-api-access-vrs88\") pod 
\"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.658653 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-config-data\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.659163 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.659488 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.659700 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.663258 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-config-data\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.663851 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ed4ad29-5963-47aa-ba01-faf16686c61d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.665415 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ed4ad29-5963-47aa-ba01-faf16686c61d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.666431 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.675169 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.677598 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrs88\" (UniqueName: 
\"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-kube-api-access-vrs88\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.692368 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.786088 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.787859 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.790936 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.791070 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.791237 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.791357 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dxjbx" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.791466 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.791593 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.791700 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.797778 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.817276 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.962211 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjv8f\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-kube-api-access-zjv8f\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.962268 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.962292 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.962316 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.962337 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.962528 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8ce7acb7-a140-4c78-a71d-d3c96aa12651-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.962573 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.962667 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.962704 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.962777 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8ce7acb7-a140-4c78-a71d-d3c96aa12651-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:23 crc kubenswrapper[4691]: I1202 08:02:23.962809 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.064163 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.064223 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.064264 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8ce7acb7-a140-4c78-a71d-d3c96aa12651-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.064293 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.064352 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjv8f\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-kube-api-access-zjv8f\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.064393 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.064416 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc 
kubenswrapper[4691]: I1202 08:02:24.064451 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.064483 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.064530 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8ce7acb7-a140-4c78-a71d-d3c96aa12651-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.064598 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.064834 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.065425 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.065746 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.065826 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.066686 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.067089 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.075542 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8ce7acb7-a140-4c78-a71d-d3c96aa12651-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.080512 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8ce7acb7-a140-4c78-a71d-d3c96aa12651-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.083802 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.088864 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.096832 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjv8f\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-kube-api-access-zjv8f\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.101464 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.125343 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.173030 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" event={"ID":"257ff17d-9e9e-4bda-a93b-8f419bd07ef3","Type":"ContainerStarted","Data":"bfd2c940218225a605b84fc3f9722e818d0938a682804257fadbd41a2df5280c"} Dec 02 08:02:24 crc kubenswrapper[4691]: I1202 08:02:24.174614 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" event={"ID":"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04","Type":"ContainerStarted","Data":"a0b30c9ef3b71fbd78758ca4f2c3c70d701cc6bf937126d042ae5d0fef296abc"} Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.102533 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.167439 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.173506 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.177096 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.178058 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.178204 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-qzt24" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.185547 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.191518 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.285336 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-kolla-config\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.285376 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.285422 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.285457 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.285516 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.285544 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.285563 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fz7cj\" (UniqueName: \"kubernetes.io/projected/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-kube-api-access-fz7cj\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.285584 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-config-data-default\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.389699 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.389791 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.389818 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz7cj\" (UniqueName: \"kubernetes.io/projected/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-kube-api-access-fz7cj\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.389848 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-config-data-default\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.389873 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-kolla-config\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.389909 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.389945 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.389975 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: 
\"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.390084 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.391115 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.392026 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.392481 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-kolla-config\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.393355 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-config-data-default\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.409693 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.423255 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.427665 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz7cj\" (UniqueName: \"kubernetes.io/projected/aa4f9395-a46a-40e4-a80c-c9b43caadc0b-kube-api-access-fz7cj\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.468002 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-galera-0\" (UID: \"aa4f9395-a46a-40e4-a80c-c9b43caadc0b\") " pod="openstack/openstack-galera-0" Dec 02 08:02:25 crc kubenswrapper[4691]: I1202 08:02:25.503136 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.653383 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.654861 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.660402 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-hjzzq" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.664380 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.664388 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.664455 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.673525 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.815981 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.816024 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.816045 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.816068 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.816100 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf6nx\" (UniqueName: \"kubernetes.io/projected/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-kube-api-access-pf6nx\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.816119 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.816144 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.816178 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.906327 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.907639 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.910017 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-jkbxs" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.911450 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.911626 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.917029 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.917125 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.917147 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.917169 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.917191 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.917218 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf6nx\" (UniqueName: \"kubernetes.io/projected/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-kube-api-access-pf6nx\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.917236 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.917258 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.917397 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.918648 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.919048 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.920233 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.921081 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.935651 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " 
pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.936276 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.951211 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.958024 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf6nx\" (UniqueName: \"kubernetes.io/projected/f14bc2d4-ce0c-440d-9e1d-15b0b8716562-kube-api-access-pf6nx\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.960879 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"f14bc2d4-ce0c-440d-9e1d-15b0b8716562\") " pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:26 crc kubenswrapper[4691]: I1202 08:02:26.980734 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.022784 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dfk2\" (UniqueName: \"kubernetes.io/projected/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-kube-api-access-2dfk2\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.022828 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.022906 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-config-data\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.022954 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.022984 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-kolla-config\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.124014 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-config-data\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.124086 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.124153 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-kolla-config\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.124232 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dfk2\" (UniqueName: \"kubernetes.io/projected/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-kube-api-access-2dfk2\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.124266 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.126134 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-config-data\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.127355 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-kolla-config\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.129482 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-combined-ca-bundle\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.129905 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-memcached-tls-certs\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.142083 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dfk2\" (UniqueName: \"kubernetes.io/projected/b3c7c69c-4fd9-4483-89b7-202f766ce6e5-kube-api-access-2dfk2\") pod \"memcached-0\" (UID: \"b3c7c69c-4fd9-4483-89b7-202f766ce6e5\") " pod="openstack/memcached-0" Dec 02 08:02:27 crc kubenswrapper[4691]: I1202 08:02:27.304139 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 02 08:02:28 crc kubenswrapper[4691]: I1202 08:02:28.652206 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 08:02:28 crc kubenswrapper[4691]: I1202 08:02:28.653628 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 08:02:28 crc kubenswrapper[4691]: I1202 08:02:28.657548 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-mqrj8" Dec 02 08:02:28 crc kubenswrapper[4691]: I1202 08:02:28.672234 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 08:02:28 crc kubenswrapper[4691]: I1202 08:02:28.749444 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdsst\" (UniqueName: \"kubernetes.io/projected/1473d859-fd02-490b-a906-bf8136cb422c-kube-api-access-hdsst\") pod \"kube-state-metrics-0\" (UID: \"1473d859-fd02-490b-a906-bf8136cb422c\") " pod="openstack/kube-state-metrics-0" Dec 02 08:02:28 crc kubenswrapper[4691]: I1202 08:02:28.851564 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdsst\" (UniqueName: \"kubernetes.io/projected/1473d859-fd02-490b-a906-bf8136cb422c-kube-api-access-hdsst\") pod \"kube-state-metrics-0\" (UID: \"1473d859-fd02-490b-a906-bf8136cb422c\") " pod="openstack/kube-state-metrics-0" Dec 02 08:02:28 crc kubenswrapper[4691]: I1202 08:02:28.872721 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdsst\" (UniqueName: \"kubernetes.io/projected/1473d859-fd02-490b-a906-bf8136cb422c-kube-api-access-hdsst\") pod \"kube-state-metrics-0\" (UID: \"1473d859-fd02-490b-a906-bf8136cb422c\") " pod="openstack/kube-state-metrics-0" Dec 02 08:02:28 crc kubenswrapper[4691]: I1202 08:02:28.983891 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 08:02:32 crc kubenswrapper[4691]: I1202 08:02:32.852574 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9jwww"] Dec 02 08:02:32 crc kubenswrapper[4691]: I1202 08:02:32.854237 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jwww" Dec 02 08:02:32 crc kubenswrapper[4691]: I1202 08:02:32.858350 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-pmqdf" Dec 02 08:02:32 crc kubenswrapper[4691]: I1202 08:02:32.858477 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 02 08:02:32 crc kubenswrapper[4691]: I1202 08:02:32.858570 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 02 08:02:32 crc kubenswrapper[4691]: I1202 08:02:32.872805 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-twpkf"] Dec 02 08:02:32 crc kubenswrapper[4691]: I1202 08:02:32.880100 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:32 crc kubenswrapper[4691]: I1202 08:02:32.892949 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9jwww"] Dec 02 08:02:32 crc kubenswrapper[4691]: I1202 08:02:32.903571 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-twpkf"] Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.026274 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fx9z\" (UniqueName: \"kubernetes.io/projected/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-kube-api-access-5fx9z\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.026320 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-ovn-controller-tls-certs\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.026363 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-var-run\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.026389 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-etc-ovs\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.026416 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-var-log-ovn\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.026433 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-var-lib\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.026450 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-var-log\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.026466 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-scripts\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 
08:02:33.026499 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-var-run-ovn\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.026538 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-var-run\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.026561 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-scripts\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.026580 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7mc2\" (UniqueName: \"kubernetes.io/projected/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-kube-api-access-q7mc2\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.026601 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-combined-ca-bundle\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.127659 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7mc2\" (UniqueName: \"kubernetes.io/projected/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-kube-api-access-q7mc2\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.127712 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-combined-ca-bundle\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.127736 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fx9z\" (UniqueName: \"kubernetes.io/projected/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-kube-api-access-5fx9z\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.127760 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-ovn-controller-tls-certs\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.127816 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-var-run\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.127839 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-etc-ovs\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.127868 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-var-log-ovn\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.127886 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-var-lib\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.127902 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-scripts\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.127916 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-var-log\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.127948 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-var-run-ovn\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.128000 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-var-run\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.128022 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-scripts\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.129015 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-etc-ovs\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " 
pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.130200 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-scripts\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.130409 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-var-log\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.130547 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-var-run-ovn\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.130627 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-var-run\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.130650 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-var-run\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.130837 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-var-log-ovn\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.131081 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-var-lib\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.139101 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-combined-ca-bundle\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.140192 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-scripts\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.142519 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-ovn-controller-tls-certs\") pod 
\"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.147170 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fx9z\" (UniqueName: \"kubernetes.io/projected/ed3b8ff0-c19d-4614-abe4-0ad6b5801b78-kube-api-access-5fx9z\") pod \"ovn-controller-9jwww\" (UID: \"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78\") " pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.148646 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7mc2\" (UniqueName: \"kubernetes.io/projected/41e5a6f8-bc4a-43d6-b49b-d065f6cef159-kube-api-access-q7mc2\") pod \"ovn-controller-ovs-twpkf\" (UID: \"41e5a6f8-bc4a-43d6-b49b-d065f6cef159\") " pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.189782 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jwww" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.213910 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.978072 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.980711 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.984249 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.986264 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.986339 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-5gv2s" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.986593 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.986796 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 02 08:02:33 crc kubenswrapper[4691]: I1202 08:02:33.986925 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.165948 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb0266-7a9e-4e28-810c-7136d8336f1b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.165988 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.166070 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/4edb0266-7a9e-4e28-810c-7136d8336f1b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.166098 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edb0266-7a9e-4e28-810c-7136d8336f1b-config\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.166119 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb0266-7a9e-4e28-810c-7136d8336f1b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.166580 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4edb0266-7a9e-4e28-810c-7136d8336f1b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.166627 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb0266-7a9e-4e28-810c-7136d8336f1b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.166664 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc6rm\" (UniqueName: \"kubernetes.io/projected/4edb0266-7a9e-4e28-810c-7136d8336f1b-kube-api-access-wc6rm\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.267714 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc6rm\" (UniqueName: \"kubernetes.io/projected/4edb0266-7a9e-4e28-810c-7136d8336f1b-kube-api-access-wc6rm\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.267820 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb0266-7a9e-4e28-810c-7136d8336f1b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.267858 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.267934 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4edb0266-7a9e-4e28-810c-7136d8336f1b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " 
pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.267976 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edb0266-7a9e-4e28-810c-7136d8336f1b-config\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.268003 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb0266-7a9e-4e28-810c-7136d8336f1b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.268040 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4edb0266-7a9e-4e28-810c-7136d8336f1b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.268088 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb0266-7a9e-4e28-810c-7136d8336f1b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.268560 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4edb0266-7a9e-4e28-810c-7136d8336f1b-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.268864 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.269451 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4edb0266-7a9e-4e28-810c-7136d8336f1b-config\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.269638 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4edb0266-7a9e-4e28-810c-7136d8336f1b-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.275699 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb0266-7a9e-4e28-810c-7136d8336f1b-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.288175 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4edb0266-7a9e-4e28-810c-7136d8336f1b-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" 
(UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.290958 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc6rm\" (UniqueName: \"kubernetes.io/projected/4edb0266-7a9e-4e28-810c-7136d8336f1b-kube-api-access-wc6rm\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.297185 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.304385 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4edb0266-7a9e-4e28-810c-7136d8336f1b-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4edb0266-7a9e-4e28-810c-7136d8336f1b\") " pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:34 crc kubenswrapper[4691]: I1202 08:02:34.325200 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.154979 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.158258 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.160440 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-qngtv" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.161735 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.162113 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.162322 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.164319 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.302955 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.303022 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.303057 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc6zq\" (UniqueName: \"kubernetes.io/projected/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-kube-api-access-wc6zq\") 
pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.303084 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.303104 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.303123 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.303189 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-config\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.303210 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.405567 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.405663 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc6zq\" (UniqueName: \"kubernetes.io/projected/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-kube-api-access-wc6zq\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.405697 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.405722 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.405745 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.405837 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-config\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.405861 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.405904 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.407040 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.407426 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.407736 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-config\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.407791 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.413256 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.413465 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc 
kubenswrapper[4691]: I1202 08:02:37.420789 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.427376 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc6zq\" (UniqueName: \"kubernetes.io/projected/9dfe73d8-e7c6-4906-bb6c-64c13435c53f-kube-api-access-wc6zq\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.443909 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"9dfe73d8-e7c6-4906-bb6c-64c13435c53f\") " pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:37 crc kubenswrapper[4691]: I1202 08:02:37.494617 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 02 08:02:39 crc kubenswrapper[4691]: I1202 08:02:39.202909 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 02 08:02:43 crc kubenswrapper[4691]: E1202 08:02:43.586212 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 08:02:43 crc kubenswrapper[4691]: E1202 08:02:43.587294 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kzss9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-fl4ml_openstack(c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:02:43 crc kubenswrapper[4691]: E1202 08:02:43.588913 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" podUID="c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd" Dec 02 08:02:43 crc kubenswrapper[4691]: E1202 08:02:43.660121 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 02 08:02:43 crc kubenswrapper[4691]: E1202 08:02:43.660605 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lgvsf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-x7nv2_openstack(0856da41-022a-43d2-acb6-6817e256dea2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:02:43 crc kubenswrapper[4691]: E1202 08:02:43.662255 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" podUID="0856da41-022a-43d2-acb6-6817e256dea2" Dec 02 08:02:44 crc kubenswrapper[4691]: I1202 08:02:44.218599 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:02:44 crc kubenswrapper[4691]: I1202 08:02:44.346322 4691 generic.go:334] "Generic (PLEG): container finished" podID="257ff17d-9e9e-4bda-a93b-8f419bd07ef3" containerID="2f468fd1a74fe9942adb2fdce1cc4081220fe8a0562cbc95a089231dac87a337" exitCode=0 Dec 02 08:02:44 crc kubenswrapper[4691]: I1202 08:02:44.346411 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" event={"ID":"257ff17d-9e9e-4bda-a93b-8f419bd07ef3","Type":"ContainerDied","Data":"2f468fd1a74fe9942adb2fdce1cc4081220fe8a0562cbc95a089231dac87a337"} Dec 02 08:02:44 crc kubenswrapper[4691]: I1202 08:02:44.347531 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b3c7c69c-4fd9-4483-89b7-202f766ce6e5","Type":"ContainerStarted","Data":"0343c00c18df1dc760f54b3147bfe1dac14d1c8a6ac0a6bb8d499c9b6059aab4"} Dec 02 08:02:44 crc kubenswrapper[4691]: I1202 08:02:44.349739 4691 generic.go:334] "Generic (PLEG): container finished" podID="fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" containerID="8410045bf43455f825b15884a68165a20b16791e110b38b20a2c1a7a2ca536da" exitCode=0 Dec 02 08:02:44 crc 
kubenswrapper[4691]: I1202 08:02:44.349861 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" event={"ID":"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04","Type":"ContainerDied","Data":"8410045bf43455f825b15884a68165a20b16791e110b38b20a2c1a7a2ca536da"} Dec 02 08:02:44 crc kubenswrapper[4691]: E1202 08:02:44.570408 4691 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Dec 02 08:02:44 crc kubenswrapper[4691]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 02 08:02:44 crc kubenswrapper[4691]: > podSandboxID="a0b30c9ef3b71fbd78758ca4f2c3c70d701cc6bf937126d042ae5d0fef296abc" Dec 02 08:02:44 crc kubenswrapper[4691]: E1202 08:02:44.570961 4691 kuberuntime_manager.go:1274] "Unhandled Error" err=< Dec 02 08:02:44 crc kubenswrapper[4691]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l4zb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} 
start failed in pod dnsmasq-dns-666b6646f7-x75lc_openstack(fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Dec 02 08:02:44 crc kubenswrapper[4691]: > logger="UnhandledError" Dec 02 08:02:44 crc kubenswrapper[4691]: E1202 08:02:44.573167 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" podUID="fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" Dec 02 08:02:44 crc kubenswrapper[4691]: I1202 08:02:44.631902 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 02 08:02:44 crc kubenswrapper[4691]: I1202 08:02:44.799684 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 02 08:02:44 crc kubenswrapper[4691]: I1202 08:02:44.812517 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:02:44 crc kubenswrapper[4691]: I1202 08:02:44.825984 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 02 08:02:44 crc kubenswrapper[4691]: W1202 08:02:44.828508 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1473d859_fd02_490b_a906_bf8136cb422c.slice/crio-3b7b0d2b4a5af31b3ddc277a56220386a0018ba3fde17bcb9b6b2c124c96240d WatchSource:0}: Error finding container 3b7b0d2b4a5af31b3ddc277a56220386a0018ba3fde17bcb9b6b2c124c96240d: Status 404 returned error can't find the container with id 3b7b0d2b4a5af31b3ddc277a56220386a0018ba3fde17bcb9b6b2c124c96240d Dec 02 08:02:44 crc kubenswrapper[4691]: I1202 08:02:44.837576 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 08:02:44 crc kubenswrapper[4691]: I1202 08:02:44.961966 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 08:02:44 crc kubenswrapper[4691]: I1202 08:02:44.974843 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.088479 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd-config\") pod \"c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd\" (UID: \"c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd\") " Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.088573 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgvsf\" (UniqueName: \"kubernetes.io/projected/0856da41-022a-43d2-acb6-6817e256dea2-kube-api-access-lgvsf\") pod \"0856da41-022a-43d2-acb6-6817e256dea2\" (UID: \"0856da41-022a-43d2-acb6-6817e256dea2\") " Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.088624 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0856da41-022a-43d2-acb6-6817e256dea2-dns-svc\") pod \"0856da41-022a-43d2-acb6-6817e256dea2\" (UID: \"0856da41-022a-43d2-acb6-6817e256dea2\") " Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.088814 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0856da41-022a-43d2-acb6-6817e256dea2-config\") pod \"0856da41-022a-43d2-acb6-6817e256dea2\" (UID: \"0856da41-022a-43d2-acb6-6817e256dea2\") " Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.088866 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzss9\" (UniqueName: \"kubernetes.io/projected/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd-kube-api-access-kzss9\") pod \"c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd\" (UID: \"c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd\") " Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.090615 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd-config" (OuterVolumeSpecName: "config") pod "c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd" (UID: "c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.090624 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0856da41-022a-43d2-acb6-6817e256dea2-config" (OuterVolumeSpecName: "config") pod "0856da41-022a-43d2-acb6-6817e256dea2" (UID: "0856da41-022a-43d2-acb6-6817e256dea2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.090676 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0856da41-022a-43d2-acb6-6817e256dea2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0856da41-022a-43d2-acb6-6817e256dea2" (UID: "0856da41-022a-43d2-acb6-6817e256dea2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.097718 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0856da41-022a-43d2-acb6-6817e256dea2-kube-api-access-lgvsf" (OuterVolumeSpecName: "kube-api-access-lgvsf") pod "0856da41-022a-43d2-acb6-6817e256dea2" (UID: "0856da41-022a-43d2-acb6-6817e256dea2"). InnerVolumeSpecName "kube-api-access-lgvsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.099376 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd-kube-api-access-kzss9" (OuterVolumeSpecName: "kube-api-access-kzss9") pod "c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd" (UID: "c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd"). InnerVolumeSpecName "kube-api-access-kzss9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.155604 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 02 08:02:45 crc kubenswrapper[4691]: W1202 08:02:45.163351 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9dfe73d8_e7c6_4906_bb6c_64c13435c53f.slice/crio-97b862f017079efcdc9d6360c978734dbed29f251b4740df574b5a83b2ebcae2 WatchSource:0}: Error finding container 97b862f017079efcdc9d6360c978734dbed29f251b4740df574b5a83b2ebcae2: Status 404 returned error can't find the container with id 97b862f017079efcdc9d6360c978734dbed29f251b4740df574b5a83b2ebcae2 Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.165327 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9jwww"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.191003 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgvsf\" (UniqueName: \"kubernetes.io/projected/0856da41-022a-43d2-acb6-6817e256dea2-kube-api-access-lgvsf\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.191320 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0856da41-022a-43d2-acb6-6817e256dea2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.191333 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0856da41-022a-43d2-acb6-6817e256dea2-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.191346 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzss9\" (UniqueName: \"kubernetes.io/projected/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd-kube-api-access-kzss9\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.191360 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.268773 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-twpkf"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.367926 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f14bc2d4-ce0c-440d-9e1d-15b0b8716562","Type":"ContainerStarted","Data":"5f32427e825cdd92c0ee1687001a72b57cc8d22e0cbcf280b6a59f8678b782b8"} Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.369681 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9dfe73d8-e7c6-4906-bb6c-64c13435c53f","Type":"ContainerStarted","Data":"97b862f017079efcdc9d6360c978734dbed29f251b4740df574b5a83b2ebcae2"} Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.371529 4691 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8ce7acb7-a140-4c78-a71d-d3c96aa12651","Type":"ContainerStarted","Data":"12d830600803b83d095461705a6f4f4eb4cec000d454d8a8d98a9c68d35b9e66"} Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.376335 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1473d859-fd02-490b-a906-bf8136cb422c","Type":"ContainerStarted","Data":"3b7b0d2b4a5af31b3ddc277a56220386a0018ba3fde17bcb9b6b2c124c96240d"} Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.379478 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"aa4f9395-a46a-40e4-a80c-c9b43caadc0b","Type":"ContainerStarted","Data":"a7e8195bfd4cda8fbcbdc87accefadff94a15be9874413d60627d625903d2978"} Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.382391 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-twpkf" event={"ID":"41e5a6f8-bc4a-43d6-b49b-d065f6cef159","Type":"ContainerStarted","Data":"40220e0b811fa2bafb1802586ac40973e5d9d73853e6e640b119690267f9c996"} Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.384034 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4edb0266-7a9e-4e28-810c-7136d8336f1b","Type":"ContainerStarted","Data":"0d98b798adb309a1d8088a0ade9aa4c5101e125724edb0820d7c7424ffd8cec0"} Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.386487 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1ed4ad29-5963-47aa-ba01-faf16686c61d","Type":"ContainerStarted","Data":"b4d00eb44634459e3e3dda851a5a0bda76903b9f4b467bef070a6776eb4dafb9"} Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.388723 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jwww" event={"ID":"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78","Type":"ContainerStarted","Data":"69ae04962f5a3617fbe3ee44f0d5438ae2285bbf245f71716e64b868bde77c14"} Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.391545 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" event={"ID":"257ff17d-9e9e-4bda-a93b-8f419bd07ef3","Type":"ContainerStarted","Data":"f211cbe91126f22ebaa8bfafe83c926efab30597eace8b092fd98645fe3303b8"} Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.391623 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.392895 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" event={"ID":"c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd","Type":"ContainerDied","Data":"00983e1048e5035be4331971d178bf572fe7e3f3baa933e604e316ecfd653eec"} Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.392924 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fl4ml" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.395381 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" event={"ID":"0856da41-022a-43d2-acb6-6817e256dea2","Type":"ContainerDied","Data":"a275491e70c28f0888d5878bf6f35db324d59a4ebceb4866dabdc3d7f072b5f6"} Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.395798 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-x7nv2" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.419154 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" podStartSLOduration=3.300415729 podStartE2EDuration="23.419128344s" podCreationTimestamp="2025-12-02 08:02:22 +0000 UTC" firstStartedPulling="2025-12-02 08:02:23.630235725 +0000 UTC m=+991.414314587" lastFinishedPulling="2025-12-02 08:02:43.74894833 +0000 UTC m=+1011.533027202" observedRunningTime="2025-12-02 08:02:45.412849256 +0000 UTC m=+1013.196928118" watchObservedRunningTime="2025-12-02 08:02:45.419128344 +0000 UTC m=+1013.203207206" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.490121 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fl4ml"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.503828 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fl4ml"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.513981 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-mjcrr"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.515482 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.528018 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.598122 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cdf429e-b93d-4009-aaa1-1c45a0083363-combined-ca-bundle\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.598184 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cdf429e-b93d-4009-aaa1-1c45a0083363-config\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.598295 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0cdf429e-b93d-4009-aaa1-1c45a0083363-ovn-rundir\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.598374 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlxwn\" (UniqueName: \"kubernetes.io/projected/0cdf429e-b93d-4009-aaa1-1c45a0083363-kube-api-access-hlxwn\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.598439 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cdf429e-b93d-4009-aaa1-1c45a0083363-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " 
pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.598513 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0cdf429e-b93d-4009-aaa1-1c45a0083363-ovs-rundir\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.619536 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mjcrr"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.685187 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-x7nv2"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.700697 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0cdf429e-b93d-4009-aaa1-1c45a0083363-ovn-rundir\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.700752 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlxwn\" (UniqueName: \"kubernetes.io/projected/0cdf429e-b93d-4009-aaa1-1c45a0083363-kube-api-access-hlxwn\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.700801 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cdf429e-b93d-4009-aaa1-1c45a0083363-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.700840 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0cdf429e-b93d-4009-aaa1-1c45a0083363-ovs-rundir\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.700907 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cdf429e-b93d-4009-aaa1-1c45a0083363-combined-ca-bundle\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.700930 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cdf429e-b93d-4009-aaa1-1c45a0083363-config\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.701716 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0cdf429e-b93d-4009-aaa1-1c45a0083363-config\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.704114 4691 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/0cdf429e-b93d-4009-aaa1-1c45a0083363-ovs-rundir\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.705169 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/0cdf429e-b93d-4009-aaa1-1c45a0083363-ovn-rundir\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.718343 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cdf429e-b93d-4009-aaa1-1c45a0083363-combined-ca-bundle\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.719028 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-x7nv2"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.724535 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlxwn\" (UniqueName: \"kubernetes.io/projected/0cdf429e-b93d-4009-aaa1-1c45a0083363-kube-api-access-hlxwn\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.729039 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0cdf429e-b93d-4009-aaa1-1c45a0083363-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-mjcrr\" (UID: \"0cdf429e-b93d-4009-aaa1-1c45a0083363\") " pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.815502 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2v7b8"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.865424 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-m866m"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.868741 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.872003 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.876745 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-mjcrr" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.932876 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-m866m"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.951476 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-x75lc"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.969782 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-nmfrn"] Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.972295 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:45 crc kubenswrapper[4691]: I1202 08:02:45.974599 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.013553 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-nmfrn"] Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.022485 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-config\") pod \"dnsmasq-dns-7f896c8c65-m866m\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.022641 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74jrj\" (UniqueName: \"kubernetes.io/projected/412c80a3-bf18-4eef-a1f6-440b969e8b28-kube-api-access-74jrj\") pod \"dnsmasq-dns-7f896c8c65-m866m\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.022723 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-m866m\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.022757 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-m866m\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.124458 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-m866m\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.124533 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-config\") pod \"dnsmasq-dns-7f896c8c65-m866m\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.124618 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr8pk\" (UniqueName: \"kubernetes.io/projected/0f9bb2b1-116d-48e6-a123-ac026f53023e-kube-api-access-vr8pk\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.124669 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " 
pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.124729 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.124977 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74jrj\" (UniqueName: \"kubernetes.io/projected/412c80a3-bf18-4eef-a1f6-440b969e8b28-kube-api-access-74jrj\") pod \"dnsmasq-dns-7f896c8c65-m866m\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.125139 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.125338 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-config\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.125675 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-m866m\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.125923 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-m866m\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.126002 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-config\") pod \"dnsmasq-dns-7f896c8c65-m866m\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.126788 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-m866m\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.150361 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74jrj\" (UniqueName: \"kubernetes.io/projected/412c80a3-bf18-4eef-a1f6-440b969e8b28-kube-api-access-74jrj\") pod \"dnsmasq-dns-7f896c8c65-m866m\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 
crc kubenswrapper[4691]: I1202 08:02:46.227825 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr8pk\" (UniqueName: \"kubernetes.io/projected/0f9bb2b1-116d-48e6-a123-ac026f53023e-kube-api-access-vr8pk\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.227955 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.228158 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.230103 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.230205 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-config\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.231166 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.231191 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.231216 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-config\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.231435 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.236946 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.250397 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr8pk\" (UniqueName: \"kubernetes.io/projected/0f9bb2b1-116d-48e6-a123-ac026f53023e-kube-api-access-vr8pk\") pod \"dnsmasq-dns-86db49b7ff-nmfrn\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.304737 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.574203 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0856da41-022a-43d2-acb6-6817e256dea2" path="/var/lib/kubelet/pods/0856da41-022a-43d2-acb6-6817e256dea2/volumes" Dec 02 08:02:46 crc kubenswrapper[4691]: I1202 08:02:46.574661 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd" path="/var/lib/kubelet/pods/c4e48b63-b66b-4a4a-adb0-aac3b2c9c2cd/volumes" Dec 02 08:02:47 crc kubenswrapper[4691]: I1202 08:02:47.457567 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" podUID="257ff17d-9e9e-4bda-a93b-8f419bd07ef3" containerName="dnsmasq-dns" containerID="cri-o://f211cbe91126f22ebaa8bfafe83c926efab30597eace8b092fd98645fe3303b8" gracePeriod=10 Dec 02 08:02:48 crc kubenswrapper[4691]: I1202 08:02:48.466968 4691 generic.go:334] "Generic (PLEG): container finished" podID="257ff17d-9e9e-4bda-a93b-8f419bd07ef3" containerID="f211cbe91126f22ebaa8bfafe83c926efab30597eace8b092fd98645fe3303b8" exitCode=0 Dec 02 08:02:48 crc kubenswrapper[4691]: I1202 08:02:48.467058 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" event={"ID":"257ff17d-9e9e-4bda-a93b-8f419bd07ef3","Type":"ContainerDied","Data":"f211cbe91126f22ebaa8bfafe83c926efab30597eace8b092fd98645fe3303b8"} Dec 02 08:02:49 crc kubenswrapper[4691]: I1202 08:02:49.925048 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.008068 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-config\") pod \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\" (UID: \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\") " Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.008118 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-dns-svc\") pod \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\" (UID: \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\") " Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.008276 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rl9qg\" (UniqueName: \"kubernetes.io/projected/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-kube-api-access-rl9qg\") pod \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\" (UID: \"257ff17d-9e9e-4bda-a93b-8f419bd07ef3\") " Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.014127 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-kube-api-access-rl9qg" (OuterVolumeSpecName: "kube-api-access-rl9qg") pod "257ff17d-9e9e-4bda-a93b-8f419bd07ef3" (UID: "257ff17d-9e9e-4bda-a93b-8f419bd07ef3"). InnerVolumeSpecName "kube-api-access-rl9qg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.044545 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "257ff17d-9e9e-4bda-a93b-8f419bd07ef3" (UID: "257ff17d-9e9e-4bda-a93b-8f419bd07ef3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.048878 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-config" (OuterVolumeSpecName: "config") pod "257ff17d-9e9e-4bda-a93b-8f419bd07ef3" (UID: "257ff17d-9e9e-4bda-a93b-8f419bd07ef3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.110933 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.110971 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.110987 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rl9qg\" (UniqueName: \"kubernetes.io/projected/257ff17d-9e9e-4bda-a93b-8f419bd07ef3-kube-api-access-rl9qg\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.487308 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" event={"ID":"257ff17d-9e9e-4bda-a93b-8f419bd07ef3","Type":"ContainerDied","Data":"bfd2c940218225a605b84fc3f9722e818d0938a682804257fadbd41a2df5280c"} Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.487369 4691 scope.go:117] "RemoveContainer" containerID="f211cbe91126f22ebaa8bfafe83c926efab30597eace8b092fd98645fe3303b8" Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.487407 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2v7b8" Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.530028 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2v7b8"] Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.537102 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2v7b8"] Dec 02 08:02:50 crc kubenswrapper[4691]: I1202 08:02:50.573546 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="257ff17d-9e9e-4bda-a93b-8f419bd07ef3" path="/var/lib/kubelet/pods/257ff17d-9e9e-4bda-a93b-8f419bd07ef3/volumes" Dec 02 08:02:51 crc kubenswrapper[4691]: I1202 08:02:51.898600 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:02:51 crc kubenswrapper[4691]: I1202 08:02:51.898710 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:02:53 crc kubenswrapper[4691]: I1202 08:02:53.530898 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-mjcrr"] Dec 02 08:02:53 crc kubenswrapper[4691]: I1202 08:02:53.700215 4691 scope.go:117] "RemoveContainer" containerID="2f468fd1a74fe9942adb2fdce1cc4081220fe8a0562cbc95a089231dac87a337" Dec 02 08:02:53 crc kubenswrapper[4691]: W1202 08:02:53.705465 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cdf429e_b93d_4009_aaa1_1c45a0083363.slice/crio-01f39b08cdf3a6cf2e81bccaa84356210feae986bffaf2896cd2abacce82858d WatchSource:0}: Error finding container 
01f39b08cdf3a6cf2e81bccaa84356210feae986bffaf2896cd2abacce82858d: Status 404 returned error can't find the container with id 01f39b08cdf3a6cf2e81bccaa84356210feae986bffaf2896cd2abacce82858d Dec 02 08:02:54 crc kubenswrapper[4691]: I1202 08:02:54.025781 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-nmfrn"] Dec 02 08:02:54 crc kubenswrapper[4691]: W1202 08:02:54.046727 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f9bb2b1_116d_48e6_a123_ac026f53023e.slice/crio-34a6cdc32ca985bc2fc9634a8caf2a48d1f4da54f318f439a0269deb514f3f4e WatchSource:0}: Error finding container 34a6cdc32ca985bc2fc9634a8caf2a48d1f4da54f318f439a0269deb514f3f4e: Status 404 returned error can't find the container with id 34a6cdc32ca985bc2fc9634a8caf2a48d1f4da54f318f439a0269deb514f3f4e Dec 02 08:02:54 crc kubenswrapper[4691]: I1202 08:02:54.309027 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-m866m"] Dec 02 08:02:54 crc kubenswrapper[4691]: W1202 08:02:54.345786 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod412c80a3_bf18_4eef_a1f6_440b969e8b28.slice/crio-d31c1c8fb38c3a2fb491c377fe3c5fcdb341c285b724c8e52d6dad4bb13e9caa WatchSource:0}: Error finding container d31c1c8fb38c3a2fb491c377fe3c5fcdb341c285b724c8e52d6dad4bb13e9caa: Status 404 returned error can't find the container with id d31c1c8fb38c3a2fb491c377fe3c5fcdb341c285b724c8e52d6dad4bb13e9caa Dec 02 08:02:54 crc kubenswrapper[4691]: I1202 08:02:54.537263 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" event={"ID":"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04","Type":"ContainerStarted","Data":"11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7"} Dec 02 08:02:54 crc kubenswrapper[4691]: I1202 08:02:54.537389 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" podUID="fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" containerName="dnsmasq-dns" containerID="cri-o://11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7" gracePeriod=10 Dec 02 08:02:54 crc kubenswrapper[4691]: I1202 08:02:54.537424 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:54 crc kubenswrapper[4691]: I1202 08:02:54.542962 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"b3c7c69c-4fd9-4483-89b7-202f766ce6e5","Type":"ContainerStarted","Data":"1689a5bbd74c7b03ca7f67dc60c62c65c50b6d7ba6417672abe10ea66a445943"} Dec 02 08:02:54 crc kubenswrapper[4691]: I1202 08:02:54.543321 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 02 08:02:54 crc kubenswrapper[4691]: I1202 08:02:54.544449 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mjcrr" event={"ID":"0cdf429e-b93d-4009-aaa1-1c45a0083363","Type":"ContainerStarted","Data":"01f39b08cdf3a6cf2e81bccaa84356210feae986bffaf2896cd2abacce82858d"} Dec 02 08:02:54 crc kubenswrapper[4691]: I1202 08:02:54.563208 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" event={"ID":"412c80a3-bf18-4eef-a1f6-440b969e8b28","Type":"ContainerStarted","Data":"d31c1c8fb38c3a2fb491c377fe3c5fcdb341c285b724c8e52d6dad4bb13e9caa"} Dec 02 08:02:54 
crc kubenswrapper[4691]: I1202 08:02:54.576692 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" podStartSLOduration=12.038844203 podStartE2EDuration="32.576680241s" podCreationTimestamp="2025-12-02 08:02:22 +0000 UTC" firstStartedPulling="2025-12-02 08:02:23.229061354 +0000 UTC m=+991.013140216" lastFinishedPulling="2025-12-02 08:02:43.766897392 +0000 UTC m=+1011.550976254" observedRunningTime="2025-12-02 08:02:54.575195183 +0000 UTC m=+1022.359274045" watchObservedRunningTime="2025-12-02 08:02:54.576680241 +0000 UTC m=+1022.360759103" Dec 02 08:02:54 crc kubenswrapper[4691]: I1202 08:02:54.583544 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" event={"ID":"0f9bb2b1-116d-48e6-a123-ac026f53023e","Type":"ContainerStarted","Data":"34a6cdc32ca985bc2fc9634a8caf2a48d1f4da54f318f439a0269deb514f3f4e"} Dec 02 08:02:54 crc kubenswrapper[4691]: I1202 08:02:54.603956 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.844414474 podStartE2EDuration="28.603913646s" podCreationTimestamp="2025-12-02 08:02:26 +0000 UTC" firstStartedPulling="2025-12-02 08:02:43.611059101 +0000 UTC m=+1011.395137963" lastFinishedPulling="2025-12-02 08:02:52.370558273 +0000 UTC m=+1020.154637135" observedRunningTime="2025-12-02 08:02:54.600352956 +0000 UTC m=+1022.384431848" watchObservedRunningTime="2025-12-02 08:02:54.603913646 +0000 UTC m=+1022.387992508" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.182074 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.236337 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4zb4\" (UniqueName: \"kubernetes.io/projected/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-kube-api-access-l4zb4\") pod \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\" (UID: \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\") " Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.236455 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-dns-svc\") pod \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\" (UID: \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\") " Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.236680 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-config\") pod \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\" (UID: \"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04\") " Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.259225 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-kube-api-access-l4zb4" (OuterVolumeSpecName: "kube-api-access-l4zb4") pod "fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" (UID: "fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04"). InnerVolumeSpecName "kube-api-access-l4zb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.339380 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4zb4\" (UniqueName: \"kubernetes.io/projected/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-kube-api-access-l4zb4\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.570809 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" (UID: "fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.575494 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-config" (OuterVolumeSpecName: "config") pod "fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" (UID: "fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.577337 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jwww" event={"ID":"ed3b8ff0-c19d-4614-abe4-0ad6b5801b78","Type":"ContainerStarted","Data":"5bf19771717752fac6d7dda2107feb219ddde788be950d30f9fece4464b7c64d"} Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.578926 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-9jwww" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.587707 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9dfe73d8-e7c6-4906-bb6c-64c13435c53f","Type":"ContainerStarted","Data":"212fd247b1340b52e4dde29be09b1597632b28e33d36005b1cdabfa8273c2afb"} Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.596803 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1473d859-fd02-490b-a906-bf8136cb422c","Type":"ContainerStarted","Data":"c1185d2699dd44b7387024287e734dcd624c4b4e4985e16c8bdafa6f889983b9"} Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.596880 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.604911 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"aa4f9395-a46a-40e4-a80c-c9b43caadc0b","Type":"ContainerStarted","Data":"d9166627578df71365ac570a0ed8f7c5295f6f261adf2ef2635300d08400de2f"} Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.607276 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" event={"ID":"412c80a3-bf18-4eef-a1f6-440b969e8b28","Type":"ContainerStarted","Data":"70881c3397b22c1178500209617683d1a873a7a3b555e52d8288c035d8857e65"} Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.610233 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-9jwww" podStartSLOduration=14.793943448 podStartE2EDuration="23.610210909s" podCreationTimestamp="2025-12-02 08:02:32 +0000 UTC" firstStartedPulling="2025-12-02 08:02:45.186923293 +0000 UTC m=+1012.971002155" lastFinishedPulling="2025-12-02 08:02:54.003190754 +0000 UTC m=+1021.787269616" observedRunningTime="2025-12-02 
08:02:55.595373056 +0000 UTC m=+1023.379451918" watchObservedRunningTime="2025-12-02 08:02:55.610210909 +0000 UTC m=+1023.394289801" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.611280 4691 generic.go:334] "Generic (PLEG): container finished" podID="0f9bb2b1-116d-48e6-a123-ac026f53023e" containerID="202cb739a749715b596f060a1b5def0a4a6ca9a34d76beee7efb38c01eb8c6a0" exitCode=0 Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.611340 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" event={"ID":"0f9bb2b1-116d-48e6-a123-ac026f53023e","Type":"ContainerDied","Data":"202cb739a749715b596f060a1b5def0a4a6ca9a34d76beee7efb38c01eb8c6a0"} Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.616483 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-twpkf" event={"ID":"41e5a6f8-bc4a-43d6-b49b-d065f6cef159","Type":"ContainerStarted","Data":"da6d0577ac299a961959f96b457ee58c31f6f47d47262c9dfa994f4b5bf3ef07"} Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.618831 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f14bc2d4-ce0c-440d-9e1d-15b0b8716562","Type":"ContainerStarted","Data":"bbdf402951fbe1c59ffc81bd5943bb173d8fffabaeacfc3ecb78ba48e945bd3d"} Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.621308 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4edb0266-7a9e-4e28-810c-7136d8336f1b","Type":"ContainerStarted","Data":"dd2061ca7f95795ed75d4f5fbca7f51f1a141eb8a3b587ee24e209aca881256b"} Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.622960 4691 generic.go:334] "Generic (PLEG): container finished" podID="fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" containerID="11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7" exitCode=0 Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.623418 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.633602 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" event={"ID":"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04","Type":"ContainerDied","Data":"11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7"} Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.633662 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-x75lc" event={"ID":"fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04","Type":"ContainerDied","Data":"a0b30c9ef3b71fbd78758ca4f2c3c70d701cc6bf937126d042ae5d0fef296abc"} Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.633687 4691 scope.go:117] "RemoveContainer" containerID="11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.659132 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.659213 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:02:55 crc kubenswrapper[4691]: I1202 08:02:55.725916 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=17.65471721 podStartE2EDuration="27.725893639s" podCreationTimestamp="2025-12-02 08:02:28 +0000 UTC" firstStartedPulling="2025-12-02 08:02:44.839290488 +0000 UTC m=+1012.623369360" lastFinishedPulling="2025-12-02 08:02:54.910466927 +0000 UTC m=+1022.694545789" observedRunningTime="2025-12-02 08:02:55.7223189 +0000 UTC m=+1023.506397762" watchObservedRunningTime="2025-12-02 08:02:55.725893639 +0000 UTC m=+1023.509972511" Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:55.908452 4691 scope.go:117] "RemoveContainer" containerID="8410045bf43455f825b15884a68165a20b16791e110b38b20a2c1a7a2ca536da" Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:55.914865 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-x75lc"] Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:55.920407 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-x75lc"] Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.330444 4691 scope.go:117] "RemoveContainer" containerID="11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7" Dec 02 08:02:56 crc kubenswrapper[4691]: E1202 08:02:56.333578 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7\": container with ID starting with 11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7 not found: ID does not exist" containerID="11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7" Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.333620 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7"} err="failed to get container status \"11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7\": rpc error: code = NotFound desc = could not find container 
\"11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7\": container with ID starting with 11809f5bb8d135ec63ede889149181fa5e83a1d6b6d38e5f014c7a0d570113b7 not found: ID does not exist" Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.333649 4691 scope.go:117] "RemoveContainer" containerID="8410045bf43455f825b15884a68165a20b16791e110b38b20a2c1a7a2ca536da" Dec 02 08:02:56 crc kubenswrapper[4691]: E1202 08:02:56.336098 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8410045bf43455f825b15884a68165a20b16791e110b38b20a2c1a7a2ca536da\": container with ID starting with 8410045bf43455f825b15884a68165a20b16791e110b38b20a2c1a7a2ca536da not found: ID does not exist" containerID="8410045bf43455f825b15884a68165a20b16791e110b38b20a2c1a7a2ca536da" Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.336133 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8410045bf43455f825b15884a68165a20b16791e110b38b20a2c1a7a2ca536da"} err="failed to get container status \"8410045bf43455f825b15884a68165a20b16791e110b38b20a2c1a7a2ca536da\": rpc error: code = NotFound desc = could not find container \"8410045bf43455f825b15884a68165a20b16791e110b38b20a2c1a7a2ca536da\": container with ID starting with 8410045bf43455f825b15884a68165a20b16791e110b38b20a2c1a7a2ca536da not found: ID does not exist" Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.572563 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" path="/var/lib/kubelet/pods/fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04/volumes" Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.635021 4691 generic.go:334] "Generic (PLEG): container finished" podID="412c80a3-bf18-4eef-a1f6-440b969e8b28" containerID="70881c3397b22c1178500209617683d1a873a7a3b555e52d8288c035d8857e65" exitCode=0 Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.635108 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" event={"ID":"412c80a3-bf18-4eef-a1f6-440b969e8b28","Type":"ContainerDied","Data":"70881c3397b22c1178500209617683d1a873a7a3b555e52d8288c035d8857e65"} Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.635142 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" event={"ID":"412c80a3-bf18-4eef-a1f6-440b969e8b28","Type":"ContainerStarted","Data":"b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a"} Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.635184 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.638641 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" event={"ID":"0f9bb2b1-116d-48e6-a123-ac026f53023e","Type":"ContainerStarted","Data":"b20ef335decaa9bb0627629b30b7b182e50a474d52d53de690c6a550968c5d84"} Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.638778 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.641938 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1ed4ad29-5963-47aa-ba01-faf16686c61d","Type":"ContainerStarted","Data":"f96f68dca2c2f22fe5a28cb0a5c8accb88b1b3a79f8ea6b787c57a7a5579d925"} Dec 02 
08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.647724 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8ce7acb7-a140-4c78-a71d-d3c96aa12651","Type":"ContainerStarted","Data":"03bad78ac06c30a1d387147bb88237a08418b8554054a53d0bc4d89c9bff2ae8"} Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.655216 4691 generic.go:334] "Generic (PLEG): container finished" podID="41e5a6f8-bc4a-43d6-b49b-d065f6cef159" containerID="da6d0577ac299a961959f96b457ee58c31f6f47d47262c9dfa994f4b5bf3ef07" exitCode=0 Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.655291 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-twpkf" event={"ID":"41e5a6f8-bc4a-43d6-b49b-d065f6cef159","Type":"ContainerDied","Data":"da6d0577ac299a961959f96b457ee58c31f6f47d47262c9dfa994f4b5bf3ef07"} Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.666437 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" podStartSLOduration=11.666414129 podStartE2EDuration="11.666414129s" podCreationTimestamp="2025-12-02 08:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:02:56.651966946 +0000 UTC m=+1024.436045798" watchObservedRunningTime="2025-12-02 08:02:56.666414129 +0000 UTC m=+1024.450492991" Dec 02 08:02:56 crc kubenswrapper[4691]: I1202 08:02:56.678979 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" podStartSLOduration=11.678964135 podStartE2EDuration="11.678964135s" podCreationTimestamp="2025-12-02 08:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:02:56.674098903 +0000 UTC m=+1024.458177755" watchObservedRunningTime="2025-12-02 08:02:56.678964135 +0000 UTC m=+1024.463042997" Dec 02 08:02:57 crc kubenswrapper[4691]: I1202 08:02:57.792795 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-twpkf" event={"ID":"41e5a6f8-bc4a-43d6-b49b-d065f6cef159","Type":"ContainerStarted","Data":"6cc2ad322e0ef174a9de34ab997e285d528675547623052326e6bba047a98c47"} Dec 02 08:02:57 crc kubenswrapper[4691]: I1202 08:02:57.793410 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-twpkf" event={"ID":"41e5a6f8-bc4a-43d6-b49b-d065f6cef159","Type":"ContainerStarted","Data":"3458d68bb915dedf50215878cb0a6aa6a5066bf3f7e5ef3e10522e3475e6cc23"} Dec 02 08:02:57 crc kubenswrapper[4691]: I1202 08:02:57.797804 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:57 crc kubenswrapper[4691]: I1202 08:02:57.797853 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:02:57 crc kubenswrapper[4691]: I1202 08:02:57.835675 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-twpkf" podStartSLOduration=17.082474048999998 podStartE2EDuration="25.835652693s" podCreationTimestamp="2025-12-02 08:02:32 +0000 UTC" firstStartedPulling="2025-12-02 08:02:45.285538484 +0000 UTC m=+1013.069617346" lastFinishedPulling="2025-12-02 08:02:54.038717128 +0000 UTC m=+1021.822795990" observedRunningTime="2025-12-02 08:02:57.832435632 +0000 UTC m=+1025.616514514" 
watchObservedRunningTime="2025-12-02 08:02:57.835652693 +0000 UTC m=+1025.619731555" Dec 02 08:03:00 crc kubenswrapper[4691]: I1202 08:03:00.815410 4691 generic.go:334] "Generic (PLEG): container finished" podID="f14bc2d4-ce0c-440d-9e1d-15b0b8716562" containerID="bbdf402951fbe1c59ffc81bd5943bb173d8fffabaeacfc3ecb78ba48e945bd3d" exitCode=0 Dec 02 08:03:00 crc kubenswrapper[4691]: I1202 08:03:00.815490 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f14bc2d4-ce0c-440d-9e1d-15b0b8716562","Type":"ContainerDied","Data":"bbdf402951fbe1c59ffc81bd5943bb173d8fffabaeacfc3ecb78ba48e945bd3d"} Dec 02 08:03:00 crc kubenswrapper[4691]: I1202 08:03:00.822257 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-mjcrr" event={"ID":"0cdf429e-b93d-4009-aaa1-1c45a0083363","Type":"ContainerStarted","Data":"256db77fa28d6d5cfbc5d25280784d92d26255bcd64ee78ccbde45f7033bfd54"} Dec 02 08:03:00 crc kubenswrapper[4691]: I1202 08:03:00.825561 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4edb0266-7a9e-4e28-810c-7136d8336f1b","Type":"ContainerStarted","Data":"0949f38b732ed7ed8e420ac5523491411ff02185c465e63a8fbb83f434332845"} Dec 02 08:03:00 crc kubenswrapper[4691]: I1202 08:03:00.827671 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"9dfe73d8-e7c6-4906-bb6c-64c13435c53f","Type":"ContainerStarted","Data":"7079edd6690d76dbb6316a6b5880532e4447df2629159d20af0e82fdd88fdfe5"} Dec 02 08:03:00 crc kubenswrapper[4691]: I1202 08:03:00.830155 4691 generic.go:334] "Generic (PLEG): container finished" podID="aa4f9395-a46a-40e4-a80c-c9b43caadc0b" containerID="d9166627578df71365ac570a0ed8f7c5295f6f261adf2ef2635300d08400de2f" exitCode=0 Dec 02 08:03:00 crc kubenswrapper[4691]: I1202 08:03:00.830218 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"aa4f9395-a46a-40e4-a80c-c9b43caadc0b","Type":"ContainerDied","Data":"d9166627578df71365ac570a0ed8f7c5295f6f261adf2ef2635300d08400de2f"} Dec 02 08:03:00 crc kubenswrapper[4691]: I1202 08:03:00.903132 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=13.230321166 podStartE2EDuration="28.903096427s" podCreationTimestamp="2025-12-02 08:02:32 +0000 UTC" firstStartedPulling="2025-12-02 08:02:44.637874782 +0000 UTC m=+1012.421953644" lastFinishedPulling="2025-12-02 08:03:00.310650043 +0000 UTC m=+1028.094728905" observedRunningTime="2025-12-02 08:03:00.899129917 +0000 UTC m=+1028.683208789" watchObservedRunningTime="2025-12-02 08:03:00.903096427 +0000 UTC m=+1028.687175289" Dec 02 08:03:00 crc kubenswrapper[4691]: I1202 08:03:00.931365 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=9.776932325 podStartE2EDuration="24.931346567s" podCreationTimestamp="2025-12-02 08:02:36 +0000 UTC" firstStartedPulling="2025-12-02 08:02:45.178948792 +0000 UTC m=+1012.963027654" lastFinishedPulling="2025-12-02 08:03:00.333363024 +0000 UTC m=+1028.117441896" observedRunningTime="2025-12-02 08:03:00.92669319 +0000 UTC m=+1028.710772052" watchObservedRunningTime="2025-12-02 08:03:00.931346567 +0000 UTC m=+1028.715425429" Dec 02 08:03:00 crc kubenswrapper[4691]: I1202 08:03:00.951175 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-mjcrr" 
podStartSLOduration=9.34766524 podStartE2EDuration="15.951158616s" podCreationTimestamp="2025-12-02 08:02:45 +0000 UTC" firstStartedPulling="2025-12-02 08:02:53.745353828 +0000 UTC m=+1021.529432690" lastFinishedPulling="2025-12-02 08:03:00.348847204 +0000 UTC m=+1028.132926066" observedRunningTime="2025-12-02 08:03:00.94892881 +0000 UTC m=+1028.733007672" watchObservedRunningTime="2025-12-02 08:03:00.951158616 +0000 UTC m=+1028.735237468" Dec 02 08:03:01 crc kubenswrapper[4691]: I1202 08:03:01.244096 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:03:01 crc kubenswrapper[4691]: I1202 08:03:01.308030 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:03:01 crc kubenswrapper[4691]: I1202 08:03:01.327090 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 02 08:03:01 crc kubenswrapper[4691]: I1202 08:03:01.490862 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-m866m"] Dec 02 08:03:01 crc kubenswrapper[4691]: I1202 08:03:01.495836 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 02 08:03:01 crc kubenswrapper[4691]: I1202 08:03:01.508289 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 02 08:03:01 crc kubenswrapper[4691]: I1202 08:03:01.582145 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.159578 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"aa4f9395-a46a-40e4-a80c-c9b43caadc0b","Type":"ContainerStarted","Data":"0d11ca65ace3614bba995006e6cc0f48a23368e5fe08017089e511d1a3e0e190"} Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.178251 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" podUID="412c80a3-bf18-4eef-a1f6-440b969e8b28" containerName="dnsmasq-dns" containerID="cri-o://b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a" gracePeriod=10 Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.178602 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"f14bc2d4-ce0c-440d-9e1d-15b0b8716562","Type":"ContainerStarted","Data":"bf75656b59ef2890fe124e3fbb77d684fa757f23a29bfd08ea65e9796aca8074"} Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.178669 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.178820 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.186917 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=29.224093205 podStartE2EDuration="38.186894292s" podCreationTimestamp="2025-12-02 08:02:24 +0000 UTC" firstStartedPulling="2025-12-02 08:02:44.888359812 +0000 UTC m=+1012.672438674" lastFinishedPulling="2025-12-02 08:02:53.851160899 +0000 UTC m=+1021.635239761" observedRunningTime="2025-12-02 08:03:02.180476631 +0000 UTC m=+1029.964555493" watchObservedRunningTime="2025-12-02 08:03:02.186894292 
+0000 UTC m=+1029.970973154" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.210730 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=28.060709605 podStartE2EDuration="37.210710901s" podCreationTimestamp="2025-12-02 08:02:25 +0000 UTC" firstStartedPulling="2025-12-02 08:02:44.853206168 +0000 UTC m=+1012.637285030" lastFinishedPulling="2025-12-02 08:02:54.003207464 +0000 UTC m=+1021.787286326" observedRunningTime="2025-12-02 08:03:02.209408898 +0000 UTC m=+1029.993487770" watchObservedRunningTime="2025-12-02 08:03:02.210710901 +0000 UTC m=+1029.994789763" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.235174 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.236718 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.305999 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.783298 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.859808 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 02 08:03:02 crc kubenswrapper[4691]: E1202 08:03:02.860362 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412c80a3-bf18-4eef-a1f6-440b969e8b28" containerName="dnsmasq-dns" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.860394 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="412c80a3-bf18-4eef-a1f6-440b969e8b28" containerName="dnsmasq-dns" Dec 02 08:03:02 crc kubenswrapper[4691]: E1202 08:03:02.860419 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257ff17d-9e9e-4bda-a93b-8f419bd07ef3" containerName="init" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.860433 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="257ff17d-9e9e-4bda-a93b-8f419bd07ef3" containerName="init" Dec 02 08:03:02 crc kubenswrapper[4691]: E1202 08:03:02.860483 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" containerName="init" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.860503 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" containerName="init" Dec 02 08:03:02 crc kubenswrapper[4691]: E1202 08:03:02.860532 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257ff17d-9e9e-4bda-a93b-8f419bd07ef3" containerName="dnsmasq-dns" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.860542 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="257ff17d-9e9e-4bda-a93b-8f419bd07ef3" containerName="dnsmasq-dns" Dec 02 08:03:02 crc kubenswrapper[4691]: E1202 08:03:02.860566 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="412c80a3-bf18-4eef-a1f6-440b969e8b28" containerName="init" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.860580 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="412c80a3-bf18-4eef-a1f6-440b969e8b28" containerName="init" Dec 02 08:03:02 crc kubenswrapper[4691]: E1202 08:03:02.860600 4691 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" containerName="dnsmasq-dns" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.860612 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" containerName="dnsmasq-dns" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.860919 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="257ff17d-9e9e-4bda-a93b-8f419bd07ef3" containerName="dnsmasq-dns" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.860953 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa5bdbdd-dfe9-4d04-a8d1-2288e58d4f04" containerName="dnsmasq-dns" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.860977 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="412c80a3-bf18-4eef-a1f6-440b969e8b28" containerName="dnsmasq-dns" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.862434 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.875751 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.875850 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.875751 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-d6lrh" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.876078 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.901974 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.962833 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-ovsdbserver-sb\") pod \"412c80a3-bf18-4eef-a1f6-440b969e8b28\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.963154 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-dns-svc\") pod \"412c80a3-bf18-4eef-a1f6-440b969e8b28\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.963239 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74jrj\" (UniqueName: \"kubernetes.io/projected/412c80a3-bf18-4eef-a1f6-440b969e8b28-kube-api-access-74jrj\") pod \"412c80a3-bf18-4eef-a1f6-440b969e8b28\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.963291 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-config\") pod \"412c80a3-bf18-4eef-a1f6-440b969e8b28\" (UID: \"412c80a3-bf18-4eef-a1f6-440b969e8b28\") " Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.963635 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/23291b10-f1ed-4d19-9689-62bdf530e28e-ovn-northd-tls-certs\") pod 
\"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.963812 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/23291b10-f1ed-4d19-9689-62bdf530e28e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.963864 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kdmn\" (UniqueName: \"kubernetes.io/projected/23291b10-f1ed-4d19-9689-62bdf530e28e-kube-api-access-6kdmn\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.963930 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23291b10-f1ed-4d19-9689-62bdf530e28e-scripts\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.963971 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/23291b10-f1ed-4d19-9689-62bdf530e28e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.964021 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23291b10-f1ed-4d19-9689-62bdf530e28e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.964149 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23291b10-f1ed-4d19-9689-62bdf530e28e-config\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:02 crc kubenswrapper[4691]: I1202 08:03:02.971130 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/412c80a3-bf18-4eef-a1f6-440b969e8b28-kube-api-access-74jrj" (OuterVolumeSpecName: "kube-api-access-74jrj") pod "412c80a3-bf18-4eef-a1f6-440b969e8b28" (UID: "412c80a3-bf18-4eef-a1f6-440b969e8b28"). InnerVolumeSpecName "kube-api-access-74jrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.020056 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "412c80a3-bf18-4eef-a1f6-440b969e8b28" (UID: "412c80a3-bf18-4eef-a1f6-440b969e8b28"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.021738 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "412c80a3-bf18-4eef-a1f6-440b969e8b28" (UID: "412c80a3-bf18-4eef-a1f6-440b969e8b28"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.039588 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-config" (OuterVolumeSpecName: "config") pod "412c80a3-bf18-4eef-a1f6-440b969e8b28" (UID: "412c80a3-bf18-4eef-a1f6-440b969e8b28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.065878 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23291b10-f1ed-4d19-9689-62bdf530e28e-config\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.065943 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/23291b10-f1ed-4d19-9689-62bdf530e28e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.066005 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/23291b10-f1ed-4d19-9689-62bdf530e28e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.066030 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kdmn\" (UniqueName: \"kubernetes.io/projected/23291b10-f1ed-4d19-9689-62bdf530e28e-kube-api-access-6kdmn\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.066061 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/23291b10-f1ed-4d19-9689-62bdf530e28e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.066095 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23291b10-f1ed-4d19-9689-62bdf530e28e-scripts\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.066144 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23291b10-f1ed-4d19-9689-62bdf530e28e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.066236 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.066250 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74jrj\" (UniqueName: \"kubernetes.io/projected/412c80a3-bf18-4eef-a1f6-440b969e8b28-kube-api-access-74jrj\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.066261 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.066271 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/412c80a3-bf18-4eef-a1f6-440b969e8b28-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.067770 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/23291b10-f1ed-4d19-9689-62bdf530e28e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.068647 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/23291b10-f1ed-4d19-9689-62bdf530e28e-scripts\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.070931 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23291b10-f1ed-4d19-9689-62bdf530e28e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.071307 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/23291b10-f1ed-4d19-9689-62bdf530e28e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.071636 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23291b10-f1ed-4d19-9689-62bdf530e28e-config\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.079710 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/23291b10-f1ed-4d19-9689-62bdf530e28e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.092409 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kdmn\" (UniqueName: \"kubernetes.io/projected/23291b10-f1ed-4d19-9689-62bdf530e28e-kube-api-access-6kdmn\") pod \"ovn-northd-0\" (UID: \"23291b10-f1ed-4d19-9689-62bdf530e28e\") " pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.188538 4691 generic.go:334] "Generic (PLEG): container finished" podID="412c80a3-bf18-4eef-a1f6-440b969e8b28" 
containerID="b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a" exitCode=0 Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.189493 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.200551 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" event={"ID":"412c80a3-bf18-4eef-a1f6-440b969e8b28","Type":"ContainerDied","Data":"b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a"} Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.200597 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-m866m" event={"ID":"412c80a3-bf18-4eef-a1f6-440b969e8b28","Type":"ContainerDied","Data":"d31c1c8fb38c3a2fb491c377fe3c5fcdb341c285b724c8e52d6dad4bb13e9caa"} Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.200618 4691 scope.go:117] "RemoveContainer" containerID="b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.202197 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.230001 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-m866m"] Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.233142 4691 scope.go:117] "RemoveContainer" containerID="70881c3397b22c1178500209617683d1a873a7a3b555e52d8288c035d8857e65" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.235703 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-m866m"] Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.255517 4691 scope.go:117] "RemoveContainer" containerID="b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a" Dec 02 08:03:03 crc kubenswrapper[4691]: E1202 08:03:03.255810 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a\": container with ID starting with b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a not found: ID does not exist" containerID="b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.255841 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a"} err="failed to get container status \"b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a\": rpc error: code = NotFound desc = could not find container \"b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a\": container with ID starting with b25b936e0c65791eba0b25120216f70aa95598a319badb5fa681b400a9bc389a not found: ID does not exist" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.255862 4691 scope.go:117] "RemoveContainer" containerID="70881c3397b22c1178500209617683d1a873a7a3b555e52d8288c035d8857e65" Dec 02 08:03:03 crc kubenswrapper[4691]: E1202 08:03:03.256471 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70881c3397b22c1178500209617683d1a873a7a3b555e52d8288c035d8857e65\": container with ID starting with 70881c3397b22c1178500209617683d1a873a7a3b555e52d8288c035d8857e65 not found: ID does not 
exist" containerID="70881c3397b22c1178500209617683d1a873a7a3b555e52d8288c035d8857e65" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.256495 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70881c3397b22c1178500209617683d1a873a7a3b555e52d8288c035d8857e65"} err="failed to get container status \"70881c3397b22c1178500209617683d1a873a7a3b555e52d8288c035d8857e65\": rpc error: code = NotFound desc = could not find container \"70881c3397b22c1178500209617683d1a873a7a3b555e52d8288c035d8857e65\": container with ID starting with 70881c3397b22c1178500209617683d1a873a7a3b555e52d8288c035d8857e65 not found: ID does not exist" Dec 02 08:03:03 crc kubenswrapper[4691]: I1202 08:03:03.675278 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 02 08:03:03 crc kubenswrapper[4691]: W1202 08:03:03.683189 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod23291b10_f1ed_4d19_9689_62bdf530e28e.slice/crio-15487ed6e9706a6fc39beb1d8e92ecf1f0d0e110ce4ecf1c4723a62807a61c79 WatchSource:0}: Error finding container 15487ed6e9706a6fc39beb1d8e92ecf1f0d0e110ce4ecf1c4723a62807a61c79: Status 404 returned error can't find the container with id 15487ed6e9706a6fc39beb1d8e92ecf1f0d0e110ce4ecf1c4723a62807a61c79 Dec 02 08:03:04 crc kubenswrapper[4691]: I1202 08:03:04.199247 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"23291b10-f1ed-4d19-9689-62bdf530e28e","Type":"ContainerStarted","Data":"15487ed6e9706a6fc39beb1d8e92ecf1f0d0e110ce4ecf1c4723a62807a61c79"} Dec 02 08:03:04 crc kubenswrapper[4691]: I1202 08:03:04.572074 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="412c80a3-bf18-4eef-a1f6-440b969e8b28" path="/var/lib/kubelet/pods/412c80a3-bf18-4eef-a1f6-440b969e8b28/volumes" Dec 02 08:03:05 crc kubenswrapper[4691]: I1202 08:03:05.503963 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 02 08:03:05 crc kubenswrapper[4691]: I1202 08:03:05.504630 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 02 08:03:06 crc kubenswrapper[4691]: I1202 08:03:06.221421 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"23291b10-f1ed-4d19-9689-62bdf530e28e","Type":"ContainerStarted","Data":"0dc3fbfacd04bf662bbf86127ca69ad26293f11fe26efb7396c6484ca704f736"} Dec 02 08:03:06 crc kubenswrapper[4691]: I1202 08:03:06.221479 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"23291b10-f1ed-4d19-9689-62bdf530e28e","Type":"ContainerStarted","Data":"7e32fb507015ac9275b3c7f681703df8933227a0abf49b8e42d1943773f46750"} Dec 02 08:03:06 crc kubenswrapper[4691]: I1202 08:03:06.222419 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 02 08:03:06 crc kubenswrapper[4691]: I1202 08:03:06.255531 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.736324976 podStartE2EDuration="4.255512082s" podCreationTimestamp="2025-12-02 08:03:02 +0000 UTC" firstStartedPulling="2025-12-02 08:03:03.685921511 +0000 UTC m=+1031.470000373" lastFinishedPulling="2025-12-02 08:03:05.205108617 +0000 UTC m=+1032.989187479" observedRunningTime="2025-12-02 08:03:06.253884911 +0000 UTC m=+1034.037963793" 
watchObservedRunningTime="2025-12-02 08:03:06.255512082 +0000 UTC m=+1034.039590944" Dec 02 08:03:06 crc kubenswrapper[4691]: I1202 08:03:06.982671 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 02 08:03:06 crc kubenswrapper[4691]: I1202 08:03:06.984011 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.013145 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.221281 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-mvp4b"] Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.222950 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.239593 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mvp4b"] Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.298125 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.304267 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.304334 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nttlf\" (UniqueName: \"kubernetes.io/projected/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-kube-api-access-nttlf\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.304377 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.304409 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-config\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.304446 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-dns-svc\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.406053 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-dns-svc\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.407405 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.407500 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nttlf\" (UniqueName: \"kubernetes.io/projected/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-kube-api-access-nttlf\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.407572 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.407652 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-config\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.407245 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-dns-svc\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.410032 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.410326 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.412460 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-config\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.433358 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.448532 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-nttlf\" (UniqueName: \"kubernetes.io/projected/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-kube-api-access-nttlf\") pod \"dnsmasq-dns-698758b865-mvp4b\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:09 crc kubenswrapper[4691]: I1202 08:03:09.543100 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.360408 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.366036 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.368830 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.368851 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-p72jz" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.368903 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.377779 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.384310 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.417985 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mvp4b"] Dec 02 08:03:10 crc kubenswrapper[4691]: W1202 08:03:10.437339 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5f602b9_3363_49e4_b0c0_b0c520b5feaf.slice/crio-316af0189478220fc980d86c0262bf1ae73630f9fc3a9284b680f3780ef0be02 WatchSource:0}: Error finding container 316af0189478220fc980d86c0262bf1ae73630f9fc3a9284b680f3780ef0be02: Status 404 returned error can't find the container with id 316af0189478220fc980d86c0262bf1ae73630f9fc3a9284b680f3780ef0be02 Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.542736 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8gpnf\" (UniqueName: \"kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-kube-api-access-8gpnf\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.543105 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.543150 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-lock\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.543170 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.543229 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-cache\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.644833 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-cache\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.644913 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8gpnf\" (UniqueName: \"kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-kube-api-access-8gpnf\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.644944 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.644999 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-lock\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.645017 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: E1202 08:03:10.645159 4691 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 08:03:10 crc kubenswrapper[4691]: E1202 08:03:10.645175 4691 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 08:03:10 crc kubenswrapper[4691]: E1202 08:03:10.645218 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift podName:c25f8b81-a8e1-4035-ae92-209fd4ed5ec0 nodeName:}" failed. No retries permitted until 2025-12-02 08:03:11.145202627 +0000 UTC m=+1038.929281489 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift") pod "swift-storage-0" (UID: "c25f8b81-a8e1-4035-ae92-209fd4ed5ec0") : configmap "swift-ring-files" not found Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.645856 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.646414 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-lock\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.646433 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-cache\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.665013 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8gpnf\" (UniqueName: \"kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-kube-api-access-8gpnf\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:10 crc kubenswrapper[4691]: I1202 08:03:10.670492 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:11 crc kubenswrapper[4691]: I1202 08:03:11.154120 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:11 crc kubenswrapper[4691]: E1202 08:03:11.154309 4691 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 08:03:11 crc kubenswrapper[4691]: E1202 08:03:11.154341 4691 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 08:03:11 crc kubenswrapper[4691]: E1202 08:03:11.154394 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift podName:c25f8b81-a8e1-4035-ae92-209fd4ed5ec0 nodeName:}" failed. No retries permitted until 2025-12-02 08:03:12.154380256 +0000 UTC m=+1039.938459118 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift") pod "swift-storage-0" (UID: "c25f8b81-a8e1-4035-ae92-209fd4ed5ec0") : configmap "swift-ring-files" not found Dec 02 08:03:11 crc kubenswrapper[4691]: I1202 08:03:11.316064 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mvp4b" event={"ID":"d5f602b9-3363-49e4-b0c0-b0c520b5feaf","Type":"ContainerStarted","Data":"316af0189478220fc980d86c0262bf1ae73630f9fc3a9284b680f3780ef0be02"} Dec 02 08:03:12 crc kubenswrapper[4691]: I1202 08:03:12.198597 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:12 crc kubenswrapper[4691]: E1202 08:03:12.198837 4691 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 08:03:12 crc kubenswrapper[4691]: E1202 08:03:12.199035 4691 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 08:03:12 crc kubenswrapper[4691]: E1202 08:03:12.199094 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift podName:c25f8b81-a8e1-4035-ae92-209fd4ed5ec0 nodeName:}" failed. No retries permitted until 2025-12-02 08:03:14.199076207 +0000 UTC m=+1041.983155069 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift") pod "swift-storage-0" (UID: "c25f8b81-a8e1-4035-ae92-209fd4ed5ec0") : configmap "swift-ring-files" not found Dec 02 08:03:12 crc kubenswrapper[4691]: I1202 08:03:12.326497 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mvp4b" event={"ID":"d5f602b9-3363-49e4-b0c0-b0c520b5feaf","Type":"ContainerStarted","Data":"8e9f05105658aca9613058a5fe7c5d10f9a7f47e168d586e156636a40650c54d"} Dec 02 08:03:13 crc kubenswrapper[4691]: I1202 08:03:13.337166 4691 generic.go:334] "Generic (PLEG): container finished" podID="d5f602b9-3363-49e4-b0c0-b0c520b5feaf" containerID="8e9f05105658aca9613058a5fe7c5d10f9a7f47e168d586e156636a40650c54d" exitCode=0 Dec 02 08:03:13 crc kubenswrapper[4691]: I1202 08:03:13.337213 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mvp4b" event={"ID":"d5f602b9-3363-49e4-b0c0-b0c520b5feaf","Type":"ContainerDied","Data":"8e9f05105658aca9613058a5fe7c5d10f9a7f47e168d586e156636a40650c54d"} Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.239423 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:14 crc kubenswrapper[4691]: E1202 08:03:14.239608 4691 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 08:03:14 crc kubenswrapper[4691]: E1202 08:03:14.239810 4691 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not 
found Dec 02 08:03:14 crc kubenswrapper[4691]: E1202 08:03:14.239864 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift podName:c25f8b81-a8e1-4035-ae92-209fd4ed5ec0 nodeName:}" failed. No retries permitted until 2025-12-02 08:03:18.239848834 +0000 UTC m=+1046.023927696 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift") pod "swift-storage-0" (UID: "c25f8b81-a8e1-4035-ae92-209fd4ed5ec0") : configmap "swift-ring-files" not found Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.339583 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-lchdp"] Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.340973 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.343457 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.348646 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.349120 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.355153 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mvp4b" event={"ID":"d5f602b9-3363-49e4-b0c0-b0c520b5feaf","Type":"ContainerStarted","Data":"bd76f007863257f002eb243891e20aa281636a5729ebd092c94791d8128966b4"} Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.355339 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-lchdp"] Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.355504 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.388238 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-mvp4b" podStartSLOduration=5.388214746 podStartE2EDuration="5.388214746s" podCreationTimestamp="2025-12-02 08:03:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:03:14.379563518 +0000 UTC m=+1042.163642400" watchObservedRunningTime="2025-12-02 08:03:14.388214746 +0000 UTC m=+1042.172293608" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.442418 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/82672deb-2527-4d18-8006-0f794dfe97c0-ring-data-devices\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.442476 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82672deb-2527-4d18-8006-0f794dfe97c0-scripts\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 
08:03:14.442514 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-dispersionconf\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.442548 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-278xl\" (UniqueName: \"kubernetes.io/projected/82672deb-2527-4d18-8006-0f794dfe97c0-kube-api-access-278xl\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.442615 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/82672deb-2527-4d18-8006-0f794dfe97c0-etc-swift\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.442632 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-combined-ca-bundle\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.442654 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-swiftconf\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.543905 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/82672deb-2527-4d18-8006-0f794dfe97c0-etc-swift\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.544296 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-combined-ca-bundle\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.544402 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-swiftconf\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.544526 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/82672deb-2527-4d18-8006-0f794dfe97c0-ring-data-devices\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 
08:03:14.544631 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82672deb-2527-4d18-8006-0f794dfe97c0-scripts\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.544555 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/82672deb-2527-4d18-8006-0f794dfe97c0-etc-swift\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.545008 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-dispersionconf\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.545100 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-278xl\" (UniqueName: \"kubernetes.io/projected/82672deb-2527-4d18-8006-0f794dfe97c0-kube-api-access-278xl\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.545389 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/82672deb-2527-4d18-8006-0f794dfe97c0-ring-data-devices\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.545659 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82672deb-2527-4d18-8006-0f794dfe97c0-scripts\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.549059 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-combined-ca-bundle\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.550678 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-dispersionconf\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.552726 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-swiftconf\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.562393 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-278xl\" (UniqueName: 
\"kubernetes.io/projected/82672deb-2527-4d18-8006-0f794dfe97c0-kube-api-access-278xl\") pod \"swift-ring-rebalance-lchdp\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:14 crc kubenswrapper[4691]: I1202 08:03:14.723885 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:15 crc kubenswrapper[4691]: I1202 08:03:15.102547 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 02 08:03:15 crc kubenswrapper[4691]: I1202 08:03:15.202568 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 02 08:03:15 crc kubenswrapper[4691]: I1202 08:03:15.330082 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-lchdp"] Dec 02 08:03:15 crc kubenswrapper[4691]: W1202 08:03:15.335670 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82672deb_2527_4d18_8006_0f794dfe97c0.slice/crio-e6a945c41f18e1c30c623d6a77e9c5cedc22e4037b22f5db1553811c1c580915 WatchSource:0}: Error finding container e6a945c41f18e1c30c623d6a77e9c5cedc22e4037b22f5db1553811c1c580915: Status 404 returned error can't find the container with id e6a945c41f18e1c30c623d6a77e9c5cedc22e4037b22f5db1553811c1c580915 Dec 02 08:03:15 crc kubenswrapper[4691]: I1202 08:03:15.364536 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lchdp" event={"ID":"82672deb-2527-4d18-8006-0f794dfe97c0","Type":"ContainerStarted","Data":"e6a945c41f18e1c30c623d6a77e9c5cedc22e4037b22f5db1553811c1c580915"} Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.007703 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-dec3-account-create-update-ncxgc"] Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.009836 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-dec3-account-create-update-ncxgc" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.012368 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.020850 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dec3-account-create-update-ncxgc"] Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.102857 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-zwd88"] Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.104122 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4821e81-a183-48bc-b558-9e1b64ad63c7-operator-scripts\") pod \"keystone-dec3-account-create-update-ncxgc\" (UID: \"d4821e81-a183-48bc-b558-9e1b64ad63c7\") " pod="openstack/keystone-dec3-account-create-update-ncxgc" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.104246 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s854k\" (UniqueName: \"kubernetes.io/projected/d4821e81-a183-48bc-b558-9e1b64ad63c7-kube-api-access-s854k\") pod \"keystone-dec3-account-create-update-ncxgc\" (UID: \"d4821e81-a183-48bc-b558-9e1b64ad63c7\") " pod="openstack/keystone-dec3-account-create-update-ncxgc" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.104963 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-zwd88" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.115449 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zwd88"] Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.200656 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rzknp"] Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.203245 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rzknp" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.204793 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbh9g\" (UniqueName: \"kubernetes.io/projected/db56f732-e81f-48e9-8b33-cc147b6ffc99-kube-api-access-jbh9g\") pod \"keystone-db-create-zwd88\" (UID: \"db56f732-e81f-48e9-8b33-cc147b6ffc99\") " pod="openstack/keystone-db-create-zwd88" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.204838 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31da07a4-df34-4c7e-91a1-9847f83ed7fd-operator-scripts\") pod \"placement-db-create-rzknp\" (UID: \"31da07a4-df34-4c7e-91a1-9847f83ed7fd\") " pod="openstack/placement-db-create-rzknp" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.204865 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dckmk\" (UniqueName: \"kubernetes.io/projected/31da07a4-df34-4c7e-91a1-9847f83ed7fd-kube-api-access-dckmk\") pod \"placement-db-create-rzknp\" (UID: \"31da07a4-df34-4c7e-91a1-9847f83ed7fd\") " pod="openstack/placement-db-create-rzknp" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.204945 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db56f732-e81f-48e9-8b33-cc147b6ffc99-operator-scripts\") pod \"keystone-db-create-zwd88\" (UID: \"db56f732-e81f-48e9-8b33-cc147b6ffc99\") " pod="openstack/keystone-db-create-zwd88" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.204980 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s854k\" (UniqueName: \"kubernetes.io/projected/d4821e81-a183-48bc-b558-9e1b64ad63c7-kube-api-access-s854k\") pod \"keystone-dec3-account-create-update-ncxgc\" (UID: \"d4821e81-a183-48bc-b558-9e1b64ad63c7\") " pod="openstack/keystone-dec3-account-create-update-ncxgc" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.205105 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4821e81-a183-48bc-b558-9e1b64ad63c7-operator-scripts\") pod \"keystone-dec3-account-create-update-ncxgc\" (UID: \"d4821e81-a183-48bc-b558-9e1b64ad63c7\") " pod="openstack/keystone-dec3-account-create-update-ncxgc" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.205963 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4821e81-a183-48bc-b558-9e1b64ad63c7-operator-scripts\") pod \"keystone-dec3-account-create-update-ncxgc\" (UID: \"d4821e81-a183-48bc-b558-9e1b64ad63c7\") " pod="openstack/keystone-dec3-account-create-update-ncxgc" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.214925 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-ff84-account-create-update-7z425"] Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.216427 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-ff84-account-create-update-7z425" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.220112 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.234909 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rzknp"] Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.241130 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s854k\" (UniqueName: \"kubernetes.io/projected/d4821e81-a183-48bc-b558-9e1b64ad63c7-kube-api-access-s854k\") pod \"keystone-dec3-account-create-update-ncxgc\" (UID: \"d4821e81-a183-48bc-b558-9e1b64ad63c7\") " pod="openstack/keystone-dec3-account-create-update-ncxgc" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.246570 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ff84-account-create-update-7z425"] Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.306708 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db56f732-e81f-48e9-8b33-cc147b6ffc99-operator-scripts\") pod \"keystone-db-create-zwd88\" (UID: \"db56f732-e81f-48e9-8b33-cc147b6ffc99\") " pod="openstack/keystone-db-create-zwd88" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.306831 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f033101-48ac-4bad-9b29-3547a7c6b721-operator-scripts\") pod \"placement-ff84-account-create-update-7z425\" (UID: \"3f033101-48ac-4bad-9b29-3547a7c6b721\") " pod="openstack/placement-ff84-account-create-update-7z425" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.306884 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbh9g\" (UniqueName: \"kubernetes.io/projected/db56f732-e81f-48e9-8b33-cc147b6ffc99-kube-api-access-jbh9g\") pod \"keystone-db-create-zwd88\" (UID: \"db56f732-e81f-48e9-8b33-cc147b6ffc99\") " pod="openstack/keystone-db-create-zwd88" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.306905 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31da07a4-df34-4c7e-91a1-9847f83ed7fd-operator-scripts\") pod \"placement-db-create-rzknp\" (UID: \"31da07a4-df34-4c7e-91a1-9847f83ed7fd\") " pod="openstack/placement-db-create-rzknp" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.306928 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dckmk\" (UniqueName: \"kubernetes.io/projected/31da07a4-df34-4c7e-91a1-9847f83ed7fd-kube-api-access-dckmk\") pod \"placement-db-create-rzknp\" (UID: \"31da07a4-df34-4c7e-91a1-9847f83ed7fd\") " pod="openstack/placement-db-create-rzknp" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.306953 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn89x\" (UniqueName: \"kubernetes.io/projected/3f033101-48ac-4bad-9b29-3547a7c6b721-kube-api-access-bn89x\") pod \"placement-ff84-account-create-update-7z425\" (UID: \"3f033101-48ac-4bad-9b29-3547a7c6b721\") " pod="openstack/placement-ff84-account-create-update-7z425" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.307540 4691 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db56f732-e81f-48e9-8b33-cc147b6ffc99-operator-scripts\") pod \"keystone-db-create-zwd88\" (UID: \"db56f732-e81f-48e9-8b33-cc147b6ffc99\") " pod="openstack/keystone-db-create-zwd88" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.307622 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31da07a4-df34-4c7e-91a1-9847f83ed7fd-operator-scripts\") pod \"placement-db-create-rzknp\" (UID: \"31da07a4-df34-4c7e-91a1-9847f83ed7fd\") " pod="openstack/placement-db-create-rzknp" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.326637 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dckmk\" (UniqueName: \"kubernetes.io/projected/31da07a4-df34-4c7e-91a1-9847f83ed7fd-kube-api-access-dckmk\") pod \"placement-db-create-rzknp\" (UID: \"31da07a4-df34-4c7e-91a1-9847f83ed7fd\") " pod="openstack/placement-db-create-rzknp" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.329167 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbh9g\" (UniqueName: \"kubernetes.io/projected/db56f732-e81f-48e9-8b33-cc147b6ffc99-kube-api-access-jbh9g\") pod \"keystone-db-create-zwd88\" (UID: \"db56f732-e81f-48e9-8b33-cc147b6ffc99\") " pod="openstack/keystone-db-create-zwd88" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.339613 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dec3-account-create-update-ncxgc" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.408561 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f033101-48ac-4bad-9b29-3547a7c6b721-operator-scripts\") pod \"placement-ff84-account-create-update-7z425\" (UID: \"3f033101-48ac-4bad-9b29-3547a7c6b721\") " pod="openstack/placement-ff84-account-create-update-7z425" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.409190 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn89x\" (UniqueName: \"kubernetes.io/projected/3f033101-48ac-4bad-9b29-3547a7c6b721-kube-api-access-bn89x\") pod \"placement-ff84-account-create-update-7z425\" (UID: \"3f033101-48ac-4bad-9b29-3547a7c6b721\") " pod="openstack/placement-ff84-account-create-update-7z425" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.412230 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f033101-48ac-4bad-9b29-3547a7c6b721-operator-scripts\") pod \"placement-ff84-account-create-update-7z425\" (UID: \"3f033101-48ac-4bad-9b29-3547a7c6b721\") " pod="openstack/placement-ff84-account-create-update-7z425" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.426632 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zwd88" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.427385 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn89x\" (UniqueName: \"kubernetes.io/projected/3f033101-48ac-4bad-9b29-3547a7c6b721-kube-api-access-bn89x\") pod \"placement-ff84-account-create-update-7z425\" (UID: \"3f033101-48ac-4bad-9b29-3547a7c6b721\") " pod="openstack/placement-ff84-account-create-update-7z425" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.587116 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rzknp" Dec 02 08:03:17 crc kubenswrapper[4691]: I1202 08:03:17.596295 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ff84-account-create-update-7z425" Dec 02 08:03:18 crc kubenswrapper[4691]: I1202 08:03:18.269872 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-zwd88"] Dec 02 08:03:18 crc kubenswrapper[4691]: I1202 08:03:18.297987 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:18 crc kubenswrapper[4691]: E1202 08:03:18.298151 4691 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 08:03:18 crc kubenswrapper[4691]: E1202 08:03:18.298171 4691 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 08:03:18 crc kubenswrapper[4691]: E1202 08:03:18.298215 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift podName:c25f8b81-a8e1-4035-ae92-209fd4ed5ec0 nodeName:}" failed. No retries permitted until 2025-12-02 08:03:26.298198545 +0000 UTC m=+1054.082277407 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift") pod "swift-storage-0" (UID: "c25f8b81-a8e1-4035-ae92-209fd4ed5ec0") : configmap "swift-ring-files" not found Dec 02 08:03:18 crc kubenswrapper[4691]: I1202 08:03:18.303380 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 02 08:03:18 crc kubenswrapper[4691]: I1202 08:03:18.321823 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-dec3-account-create-update-ncxgc"] Dec 02 08:03:19 crc kubenswrapper[4691]: I1202 08:03:19.544905 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:03:19 crc kubenswrapper[4691]: I1202 08:03:19.611343 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-nmfrn"] Dec 02 08:03:19 crc kubenswrapper[4691]: I1202 08:03:19.611595 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" podUID="0f9bb2b1-116d-48e6-a123-ac026f53023e" containerName="dnsmasq-dns" containerID="cri-o://b20ef335decaa9bb0627629b30b7b182e50a474d52d53de690c6a550968c5d84" gracePeriod=10 Dec 02 08:03:20 crc kubenswrapper[4691]: I1202 08:03:20.416588 4691 generic.go:334] "Generic (PLEG): container finished" podID="0f9bb2b1-116d-48e6-a123-ac026f53023e" containerID="b20ef335decaa9bb0627629b30b7b182e50a474d52d53de690c6a550968c5d84" exitCode=0 Dec 02 08:03:20 crc kubenswrapper[4691]: I1202 08:03:20.416642 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" event={"ID":"0f9bb2b1-116d-48e6-a123-ac026f53023e","Type":"ContainerDied","Data":"b20ef335decaa9bb0627629b30b7b182e50a474d52d53de690c6a550968c5d84"} Dec 02 08:03:20 crc kubenswrapper[4691]: W1202 08:03:20.486209 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4821e81_a183_48bc_b558_9e1b64ad63c7.slice/crio-4adc5f03c1f0928db6e38863fac49ea7969df8fe056c94b5ae0cd82f4d264ab5 WatchSource:0}: Error finding container 4adc5f03c1f0928db6e38863fac49ea7969df8fe056c94b5ae0cd82f4d264ab5: Status 404 returned error can't find the container with id 4adc5f03c1f0928db6e38863fac49ea7969df8fe056c94b5ae0cd82f4d264ab5 Dec 02 08:03:20 crc kubenswrapper[4691]: I1202 08:03:20.863247 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:03:20 crc kubenswrapper[4691]: I1202 08:03:20.946821 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-ovsdbserver-nb\") pod \"0f9bb2b1-116d-48e6-a123-ac026f53023e\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " Dec 02 08:03:20 crc kubenswrapper[4691]: I1202 08:03:20.946977 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-dns-svc\") pod \"0f9bb2b1-116d-48e6-a123-ac026f53023e\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " Dec 02 08:03:20 crc kubenswrapper[4691]: I1202 08:03:20.947001 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-config\") pod \"0f9bb2b1-116d-48e6-a123-ac026f53023e\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " Dec 02 08:03:20 crc kubenswrapper[4691]: I1202 08:03:20.947068 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-ovsdbserver-sb\") pod \"0f9bb2b1-116d-48e6-a123-ac026f53023e\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " Dec 02 08:03:20 crc kubenswrapper[4691]: I1202 08:03:20.947110 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr8pk\" (UniqueName: \"kubernetes.io/projected/0f9bb2b1-116d-48e6-a123-ac026f53023e-kube-api-access-vr8pk\") pod \"0f9bb2b1-116d-48e6-a123-ac026f53023e\" (UID: \"0f9bb2b1-116d-48e6-a123-ac026f53023e\") " Dec 02 08:03:20 crc kubenswrapper[4691]: I1202 08:03:20.953081 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f9bb2b1-116d-48e6-a123-ac026f53023e-kube-api-access-vr8pk" (OuterVolumeSpecName: "kube-api-access-vr8pk") pod "0f9bb2b1-116d-48e6-a123-ac026f53023e" (UID: "0f9bb2b1-116d-48e6-a123-ac026f53023e"). InnerVolumeSpecName "kube-api-access-vr8pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.000146 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0f9bb2b1-116d-48e6-a123-ac026f53023e" (UID: "0f9bb2b1-116d-48e6-a123-ac026f53023e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.027442 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0f9bb2b1-116d-48e6-a123-ac026f53023e" (UID: "0f9bb2b1-116d-48e6-a123-ac026f53023e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.031380 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0f9bb2b1-116d-48e6-a123-ac026f53023e" (UID: "0f9bb2b1-116d-48e6-a123-ac026f53023e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.035595 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-config" (OuterVolumeSpecName: "config") pod "0f9bb2b1-116d-48e6-a123-ac026f53023e" (UID: "0f9bb2b1-116d-48e6-a123-ac026f53023e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.049063 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.049099 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.049111 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.049122 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0f9bb2b1-116d-48e6-a123-ac026f53023e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.049136 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr8pk\" (UniqueName: \"kubernetes.io/projected/0f9bb2b1-116d-48e6-a123-ac026f53023e-kube-api-access-vr8pk\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.099411 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-ff84-account-create-update-7z425"] Dec 02 08:03:21 crc kubenswrapper[4691]: W1202 08:03:21.100574 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3f033101_48ac_4bad_9b29_3547a7c6b721.slice/crio-8409cb00033929f850ba78b34816f4b2d5c6548664b053532a986a206c04ab45 WatchSource:0}: Error finding container 8409cb00033929f850ba78b34816f4b2d5c6548664b053532a986a206c04ab45: Status 404 returned error can't find the container with id 8409cb00033929f850ba78b34816f4b2d5c6548664b053532a986a206c04ab45 Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.200785 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rzknp"] Dec 02 08:03:21 crc kubenswrapper[4691]: W1202 08:03:21.316808 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31da07a4_df34_4c7e_91a1_9847f83ed7fd.slice/crio-5884e363a77d7e9f5a82a1b67d6bbf47e0d9b1db205c3f72537ab6caab662910 WatchSource:0}: Error finding container 5884e363a77d7e9f5a82a1b67d6bbf47e0d9b1db205c3f72537ab6caab662910: Status 404 returned error can't find the container with id 5884e363a77d7e9f5a82a1b67d6bbf47e0d9b1db205c3f72537ab6caab662910 Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.431066 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rzknp" event={"ID":"31da07a4-df34-4c7e-91a1-9847f83ed7fd","Type":"ContainerStarted","Data":"5884e363a77d7e9f5a82a1b67d6bbf47e0d9b1db205c3f72537ab6caab662910"} Dec 02 08:03:21 crc 
kubenswrapper[4691]: I1202 08:03:21.435926 4691 generic.go:334] "Generic (PLEG): container finished" podID="db56f732-e81f-48e9-8b33-cc147b6ffc99" containerID="948d22f00eea93e493851bc530194ce67e9036824eed4028b52384b4119ac0cb" exitCode=0 Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.436666 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zwd88" event={"ID":"db56f732-e81f-48e9-8b33-cc147b6ffc99","Type":"ContainerDied","Data":"948d22f00eea93e493851bc530194ce67e9036824eed4028b52384b4119ac0cb"} Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.436714 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zwd88" event={"ID":"db56f732-e81f-48e9-8b33-cc147b6ffc99","Type":"ContainerStarted","Data":"9bc36b44511d520c9722437755a998969a8e6e0ec19882b1dacc1845422da759"} Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.445317 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.446946 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-nmfrn" event={"ID":"0f9bb2b1-116d-48e6-a123-ac026f53023e","Type":"ContainerDied","Data":"34a6cdc32ca985bc2fc9634a8caf2a48d1f4da54f318f439a0269deb514f3f4e"} Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.447234 4691 scope.go:117] "RemoveContainer" containerID="b20ef335decaa9bb0627629b30b7b182e50a474d52d53de690c6a550968c5d84" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.452201 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lchdp" event={"ID":"82672deb-2527-4d18-8006-0f794dfe97c0","Type":"ContainerStarted","Data":"5013954bb97662a2d2d516f8acd58da492f3a45199c5e068ad0bb24d9e78cdbb"} Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.464645 4691 generic.go:334] "Generic (PLEG): container finished" podID="d4821e81-a183-48bc-b558-9e1b64ad63c7" containerID="599e7369ee44d8258bea4c260c65084d2e146ec3f8647477681bdcb10fb3ae4c" exitCode=0 Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.464730 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dec3-account-create-update-ncxgc" event={"ID":"d4821e81-a183-48bc-b558-9e1b64ad63c7","Type":"ContainerDied","Data":"599e7369ee44d8258bea4c260c65084d2e146ec3f8647477681bdcb10fb3ae4c"} Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.464803 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dec3-account-create-update-ncxgc" event={"ID":"d4821e81-a183-48bc-b558-9e1b64ad63c7","Type":"ContainerStarted","Data":"4adc5f03c1f0928db6e38863fac49ea7969df8fe056c94b5ae0cd82f4d264ab5"} Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.467460 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ff84-account-create-update-7z425" event={"ID":"3f033101-48ac-4bad-9b29-3547a7c6b721","Type":"ContainerStarted","Data":"b6029f2730d172ed992afb2aa8e4f965ce51ddf5bb9f5d8a0cc5d7cd8ebe6ba3"} Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.467511 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ff84-account-create-update-7z425" event={"ID":"3f033101-48ac-4bad-9b29-3547a7c6b721","Type":"ContainerStarted","Data":"8409cb00033929f850ba78b34816f4b2d5c6548664b053532a986a206c04ab45"} Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.499812 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/swift-ring-rebalance-lchdp" podStartSLOduration=2.233596687 podStartE2EDuration="7.499739792s" podCreationTimestamp="2025-12-02 08:03:14 +0000 UTC" firstStartedPulling="2025-12-02 08:03:15.338321497 +0000 UTC m=+1043.122400359" lastFinishedPulling="2025-12-02 08:03:20.604464602 +0000 UTC m=+1048.388543464" observedRunningTime="2025-12-02 08:03:21.490580552 +0000 UTC m=+1049.274659414" watchObservedRunningTime="2025-12-02 08:03:21.499739792 +0000 UTC m=+1049.283818654" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.518643 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-ff84-account-create-update-7z425" podStartSLOduration=4.518610947 podStartE2EDuration="4.518610947s" podCreationTimestamp="2025-12-02 08:03:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:03:21.512191246 +0000 UTC m=+1049.296270108" watchObservedRunningTime="2025-12-02 08:03:21.518610947 +0000 UTC m=+1049.302689809" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.596875 4691 scope.go:117] "RemoveContainer" containerID="202cb739a749715b596f060a1b5def0a4a6ca9a34d76beee7efb38c01eb8c6a0" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.608538 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-nmfrn"] Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.621596 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-nmfrn"] Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.899384 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.899445 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.899507 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.900265 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29b37d8d63090a5b29435fd6f341a26e6433431bf7160b686913291b1dd9efc2"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:03:21 crc kubenswrapper[4691]: I1202 08:03:21.900341 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://29b37d8d63090a5b29435fd6f341a26e6433431bf7160b686913291b1dd9efc2" gracePeriod=600 Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.481238 4691 generic.go:334] "Generic (PLEG): container finished" podID="31da07a4-df34-4c7e-91a1-9847f83ed7fd" 
containerID="d751872029579419eeea5baacbc41ef848de61e7a80259cc42dd7016c5e88e85" exitCode=0 Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.481439 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rzknp" event={"ID":"31da07a4-df34-4c7e-91a1-9847f83ed7fd","Type":"ContainerDied","Data":"d751872029579419eeea5baacbc41ef848de61e7a80259cc42dd7016c5e88e85"} Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.486789 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="29b37d8d63090a5b29435fd6f341a26e6433431bf7160b686913291b1dd9efc2" exitCode=0 Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.486851 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"29b37d8d63090a5b29435fd6f341a26e6433431bf7160b686913291b1dd9efc2"} Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.486876 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"3c6622064f9b8ca4e8932f776c16ae5af9973dd396f94dcf631a7aa1f00aa037"} Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.486892 4691 scope.go:117] "RemoveContainer" containerID="49c546328dbd8547e0ff1dcfee99f503a31e4448db0773f3ebd91ead3aa35f8b" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.490380 4691 generic.go:334] "Generic (PLEG): container finished" podID="3f033101-48ac-4bad-9b29-3547a7c6b721" containerID="b6029f2730d172ed992afb2aa8e4f965ce51ddf5bb9f5d8a0cc5d7cd8ebe6ba3" exitCode=0 Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.490461 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ff84-account-create-update-7z425" event={"ID":"3f033101-48ac-4bad-9b29-3547a7c6b721","Type":"ContainerDied","Data":"b6029f2730d172ed992afb2aa8e4f965ce51ddf5bb9f5d8a0cc5d7cd8ebe6ba3"} Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.533883 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-c2vdf"] Dec 02 08:03:22 crc kubenswrapper[4691]: E1202 08:03:22.534412 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f9bb2b1-116d-48e6-a123-ac026f53023e" containerName="dnsmasq-dns" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.534438 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9bb2b1-116d-48e6-a123-ac026f53023e" containerName="dnsmasq-dns" Dec 02 08:03:22 crc kubenswrapper[4691]: E1202 08:03:22.534461 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f9bb2b1-116d-48e6-a123-ac026f53023e" containerName="init" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.534472 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f9bb2b1-116d-48e6-a123-ac026f53023e" containerName="init" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.534697 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f9bb2b1-116d-48e6-a123-ac026f53023e" containerName="dnsmasq-dns" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.535543 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-c2vdf" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.543535 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-c2vdf"] Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.590019 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f9bb2b1-116d-48e6-a123-ac026f53023e" path="/var/lib/kubelet/pods/0f9bb2b1-116d-48e6-a123-ac026f53023e/volumes" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.594469 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4640a2f1-1c73-4c41-b389-d956599766e4-operator-scripts\") pod \"glance-db-create-c2vdf\" (UID: \"4640a2f1-1c73-4c41-b389-d956599766e4\") " pod="openstack/glance-db-create-c2vdf" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.595329 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49psd\" (UniqueName: \"kubernetes.io/projected/4640a2f1-1c73-4c41-b389-d956599766e4-kube-api-access-49psd\") pod \"glance-db-create-c2vdf\" (UID: \"4640a2f1-1c73-4c41-b389-d956599766e4\") " pod="openstack/glance-db-create-c2vdf" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.651400 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-b036-account-create-update-z7rxp"] Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.653203 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b036-account-create-update-z7rxp" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.665214 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.668324 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b036-account-create-update-z7rxp"] Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.697348 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49psd\" (UniqueName: \"kubernetes.io/projected/4640a2f1-1c73-4c41-b389-d956599766e4-kube-api-access-49psd\") pod \"glance-db-create-c2vdf\" (UID: \"4640a2f1-1c73-4c41-b389-d956599766e4\") " pod="openstack/glance-db-create-c2vdf" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.697438 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdb2a273-6a0e-4572-bf19-509df8b07a5f-operator-scripts\") pod \"glance-b036-account-create-update-z7rxp\" (UID: \"fdb2a273-6a0e-4572-bf19-509df8b07a5f\") " pod="openstack/glance-b036-account-create-update-z7rxp" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.697505 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4640a2f1-1c73-4c41-b389-d956599766e4-operator-scripts\") pod \"glance-db-create-c2vdf\" (UID: \"4640a2f1-1c73-4c41-b389-d956599766e4\") " pod="openstack/glance-db-create-c2vdf" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.697586 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s975\" (UniqueName: \"kubernetes.io/projected/fdb2a273-6a0e-4572-bf19-509df8b07a5f-kube-api-access-8s975\") pod \"glance-b036-account-create-update-z7rxp\" (UID: 
\"fdb2a273-6a0e-4572-bf19-509df8b07a5f\") " pod="openstack/glance-b036-account-create-update-z7rxp" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.698796 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4640a2f1-1c73-4c41-b389-d956599766e4-operator-scripts\") pod \"glance-db-create-c2vdf\" (UID: \"4640a2f1-1c73-4c41-b389-d956599766e4\") " pod="openstack/glance-db-create-c2vdf" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.718494 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49psd\" (UniqueName: \"kubernetes.io/projected/4640a2f1-1c73-4c41-b389-d956599766e4-kube-api-access-49psd\") pod \"glance-db-create-c2vdf\" (UID: \"4640a2f1-1c73-4c41-b389-d956599766e4\") " pod="openstack/glance-db-create-c2vdf" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.798616 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8s975\" (UniqueName: \"kubernetes.io/projected/fdb2a273-6a0e-4572-bf19-509df8b07a5f-kube-api-access-8s975\") pod \"glance-b036-account-create-update-z7rxp\" (UID: \"fdb2a273-6a0e-4572-bf19-509df8b07a5f\") " pod="openstack/glance-b036-account-create-update-z7rxp" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.799000 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdb2a273-6a0e-4572-bf19-509df8b07a5f-operator-scripts\") pod \"glance-b036-account-create-update-z7rxp\" (UID: \"fdb2a273-6a0e-4572-bf19-509df8b07a5f\") " pod="openstack/glance-b036-account-create-update-z7rxp" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.799805 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdb2a273-6a0e-4572-bf19-509df8b07a5f-operator-scripts\") pod \"glance-b036-account-create-update-z7rxp\" (UID: \"fdb2a273-6a0e-4572-bf19-509df8b07a5f\") " pod="openstack/glance-b036-account-create-update-z7rxp" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.819698 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s975\" (UniqueName: \"kubernetes.io/projected/fdb2a273-6a0e-4572-bf19-509df8b07a5f-kube-api-access-8s975\") pod \"glance-b036-account-create-update-z7rxp\" (UID: \"fdb2a273-6a0e-4572-bf19-509df8b07a5f\") " pod="openstack/glance-b036-account-create-update-z7rxp" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.871520 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c2vdf" Dec 02 08:03:22 crc kubenswrapper[4691]: I1202 08:03:22.978926 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b036-account-create-update-z7rxp" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.006797 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dec3-account-create-update-ncxgc" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.091029 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zwd88" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.105528 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4821e81-a183-48bc-b558-9e1b64ad63c7-operator-scripts\") pod \"d4821e81-a183-48bc-b558-9e1b64ad63c7\" (UID: \"d4821e81-a183-48bc-b558-9e1b64ad63c7\") " Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.105627 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db56f732-e81f-48e9-8b33-cc147b6ffc99-operator-scripts\") pod \"db56f732-e81f-48e9-8b33-cc147b6ffc99\" (UID: \"db56f732-e81f-48e9-8b33-cc147b6ffc99\") " Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.105716 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jbh9g\" (UniqueName: \"kubernetes.io/projected/db56f732-e81f-48e9-8b33-cc147b6ffc99-kube-api-access-jbh9g\") pod \"db56f732-e81f-48e9-8b33-cc147b6ffc99\" (UID: \"db56f732-e81f-48e9-8b33-cc147b6ffc99\") " Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.105847 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s854k\" (UniqueName: \"kubernetes.io/projected/d4821e81-a183-48bc-b558-9e1b64ad63c7-kube-api-access-s854k\") pod \"d4821e81-a183-48bc-b558-9e1b64ad63c7\" (UID: \"d4821e81-a183-48bc-b558-9e1b64ad63c7\") " Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.106502 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4821e81-a183-48bc-b558-9e1b64ad63c7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4821e81-a183-48bc-b558-9e1b64ad63c7" (UID: "d4821e81-a183-48bc-b558-9e1b64ad63c7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.106522 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db56f732-e81f-48e9-8b33-cc147b6ffc99-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db56f732-e81f-48e9-8b33-cc147b6ffc99" (UID: "db56f732-e81f-48e9-8b33-cc147b6ffc99"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.112982 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db56f732-e81f-48e9-8b33-cc147b6ffc99-kube-api-access-jbh9g" (OuterVolumeSpecName: "kube-api-access-jbh9g") pod "db56f732-e81f-48e9-8b33-cc147b6ffc99" (UID: "db56f732-e81f-48e9-8b33-cc147b6ffc99"). InnerVolumeSpecName "kube-api-access-jbh9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.117544 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4821e81-a183-48bc-b558-9e1b64ad63c7-kube-api-access-s854k" (OuterVolumeSpecName: "kube-api-access-s854k") pod "d4821e81-a183-48bc-b558-9e1b64ad63c7" (UID: "d4821e81-a183-48bc-b558-9e1b64ad63c7"). InnerVolumeSpecName "kube-api-access-s854k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.208061 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s854k\" (UniqueName: \"kubernetes.io/projected/d4821e81-a183-48bc-b558-9e1b64ad63c7-kube-api-access-s854k\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.208100 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4821e81-a183-48bc-b558-9e1b64ad63c7-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.208114 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db56f732-e81f-48e9-8b33-cc147b6ffc99-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.208123 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbh9g\" (UniqueName: \"kubernetes.io/projected/db56f732-e81f-48e9-8b33-cc147b6ffc99-kube-api-access-jbh9g\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.474550 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-c2vdf"] Dec 02 08:03:23 crc kubenswrapper[4691]: W1202 08:03:23.479101 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4640a2f1_1c73_4c41_b389_d956599766e4.slice/crio-253012aba6d1137df9e2f64bef348675d8fb7782685d014290c48a0259aa19f7 WatchSource:0}: Error finding container 253012aba6d1137df9e2f64bef348675d8fb7782685d014290c48a0259aa19f7: Status 404 returned error can't find the container with id 253012aba6d1137df9e2f64bef348675d8fb7782685d014290c48a0259aa19f7 Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.508362 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-dec3-account-create-update-ncxgc" event={"ID":"d4821e81-a183-48bc-b558-9e1b64ad63c7","Type":"ContainerDied","Data":"4adc5f03c1f0928db6e38863fac49ea7969df8fe056c94b5ae0cd82f4d264ab5"} Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.508414 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4adc5f03c1f0928db6e38863fac49ea7969df8fe056c94b5ae0cd82f4d264ab5" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.508377 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-dec3-account-create-update-ncxgc" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.510776 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-zwd88" event={"ID":"db56f732-e81f-48e9-8b33-cc147b6ffc99","Type":"ContainerDied","Data":"9bc36b44511d520c9722437755a998969a8e6e0ec19882b1dacc1845422da759"} Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.510783 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-zwd88" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.510810 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bc36b44511d520c9722437755a998969a8e6e0ec19882b1dacc1845422da759" Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.517571 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-c2vdf" event={"ID":"4640a2f1-1c73-4c41-b389-d956599766e4","Type":"ContainerStarted","Data":"253012aba6d1137df9e2f64bef348675d8fb7782685d014290c48a0259aa19f7"} Dec 02 08:03:23 crc kubenswrapper[4691]: I1202 08:03:23.588909 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-b036-account-create-update-z7rxp"] Dec 02 08:03:23 crc kubenswrapper[4691]: W1202 08:03:23.594422 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdb2a273_6a0e_4572_bf19_509df8b07a5f.slice/crio-d7cc8c89942fe86bbc7c078c61a8b3367b35def2739516bc883c40c0782942a3 WatchSource:0}: Error finding container d7cc8c89942fe86bbc7c078c61a8b3367b35def2739516bc883c40c0782942a3: Status 404 returned error can't find the container with id d7cc8c89942fe86bbc7c078c61a8b3367b35def2739516bc883c40c0782942a3 Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.037001 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rzknp" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.048007 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ff84-account-create-update-7z425" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.225814 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31da07a4-df34-4c7e-91a1-9847f83ed7fd-operator-scripts\") pod \"31da07a4-df34-4c7e-91a1-9847f83ed7fd\" (UID: \"31da07a4-df34-4c7e-91a1-9847f83ed7fd\") " Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.225916 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dckmk\" (UniqueName: \"kubernetes.io/projected/31da07a4-df34-4c7e-91a1-9847f83ed7fd-kube-api-access-dckmk\") pod \"31da07a4-df34-4c7e-91a1-9847f83ed7fd\" (UID: \"31da07a4-df34-4c7e-91a1-9847f83ed7fd\") " Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.225993 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f033101-48ac-4bad-9b29-3547a7c6b721-operator-scripts\") pod \"3f033101-48ac-4bad-9b29-3547a7c6b721\" (UID: \"3f033101-48ac-4bad-9b29-3547a7c6b721\") " Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.226086 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bn89x\" (UniqueName: \"kubernetes.io/projected/3f033101-48ac-4bad-9b29-3547a7c6b721-kube-api-access-bn89x\") pod \"3f033101-48ac-4bad-9b29-3547a7c6b721\" (UID: \"3f033101-48ac-4bad-9b29-3547a7c6b721\") " Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.226851 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31da07a4-df34-4c7e-91a1-9847f83ed7fd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "31da07a4-df34-4c7e-91a1-9847f83ed7fd" (UID: "31da07a4-df34-4c7e-91a1-9847f83ed7fd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.227206 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f033101-48ac-4bad-9b29-3547a7c6b721-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3f033101-48ac-4bad-9b29-3547a7c6b721" (UID: "3f033101-48ac-4bad-9b29-3547a7c6b721"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.231180 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f033101-48ac-4bad-9b29-3547a7c6b721-kube-api-access-bn89x" (OuterVolumeSpecName: "kube-api-access-bn89x") pod "3f033101-48ac-4bad-9b29-3547a7c6b721" (UID: "3f033101-48ac-4bad-9b29-3547a7c6b721"). InnerVolumeSpecName "kube-api-access-bn89x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.231865 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31da07a4-df34-4c7e-91a1-9847f83ed7fd-kube-api-access-dckmk" (OuterVolumeSpecName: "kube-api-access-dckmk") pod "31da07a4-df34-4c7e-91a1-9847f83ed7fd" (UID: "31da07a4-df34-4c7e-91a1-9847f83ed7fd"). InnerVolumeSpecName "kube-api-access-dckmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.327815 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31da07a4-df34-4c7e-91a1-9847f83ed7fd-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.327853 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dckmk\" (UniqueName: \"kubernetes.io/projected/31da07a4-df34-4c7e-91a1-9847f83ed7fd-kube-api-access-dckmk\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.327868 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3f033101-48ac-4bad-9b29-3547a7c6b721-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.327879 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bn89x\" (UniqueName: \"kubernetes.io/projected/3f033101-48ac-4bad-9b29-3547a7c6b721-kube-api-access-bn89x\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.527375 4691 generic.go:334] "Generic (PLEG): container finished" podID="fdb2a273-6a0e-4572-bf19-509df8b07a5f" containerID="a4c40e80369dea4bb5ab5b70a7ab69f581d13fb8e549edf9d330336147b2565c" exitCode=0 Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.527426 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b036-account-create-update-z7rxp" event={"ID":"fdb2a273-6a0e-4572-bf19-509df8b07a5f","Type":"ContainerDied","Data":"a4c40e80369dea4bb5ab5b70a7ab69f581d13fb8e549edf9d330336147b2565c"} Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.527496 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b036-account-create-update-z7rxp" event={"ID":"fdb2a273-6a0e-4572-bf19-509df8b07a5f","Type":"ContainerStarted","Data":"d7cc8c89942fe86bbc7c078c61a8b3367b35def2739516bc883c40c0782942a3"} Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.528716 4691 generic.go:334] "Generic (PLEG): container 
finished" podID="4640a2f1-1c73-4c41-b389-d956599766e4" containerID="ad941f9a593e9dcd53cc0715e80ad6cfe7fa78cb1f78f37efd8b047895ad3a28" exitCode=0 Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.528811 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-c2vdf" event={"ID":"4640a2f1-1c73-4c41-b389-d956599766e4","Type":"ContainerDied","Data":"ad941f9a593e9dcd53cc0715e80ad6cfe7fa78cb1f78f37efd8b047895ad3a28"} Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.531012 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-ff84-account-create-update-7z425" event={"ID":"3f033101-48ac-4bad-9b29-3547a7c6b721","Type":"ContainerDied","Data":"8409cb00033929f850ba78b34816f4b2d5c6548664b053532a986a206c04ab45"} Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.531051 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8409cb00033929f850ba78b34816f4b2d5c6548664b053532a986a206c04ab45" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.531064 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-ff84-account-create-update-7z425" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.533129 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rzknp" event={"ID":"31da07a4-df34-4c7e-91a1-9847f83ed7fd","Type":"ContainerDied","Data":"5884e363a77d7e9f5a82a1b67d6bbf47e0d9b1db205c3f72537ab6caab662910"} Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.533157 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5884e363a77d7e9f5a82a1b67d6bbf47e0d9b1db205c3f72537ab6caab662910" Dec 02 08:03:24 crc kubenswrapper[4691]: I1202 08:03:24.533248 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rzknp" Dec 02 08:03:25 crc kubenswrapper[4691]: I1202 08:03:25.976384 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c2vdf" Dec 02 08:03:25 crc kubenswrapper[4691]: I1202 08:03:25.982228 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-b036-account-create-update-z7rxp" Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.057174 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49psd\" (UniqueName: \"kubernetes.io/projected/4640a2f1-1c73-4c41-b389-d956599766e4-kube-api-access-49psd\") pod \"4640a2f1-1c73-4c41-b389-d956599766e4\" (UID: \"4640a2f1-1c73-4c41-b389-d956599766e4\") " Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.057260 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4640a2f1-1c73-4c41-b389-d956599766e4-operator-scripts\") pod \"4640a2f1-1c73-4c41-b389-d956599766e4\" (UID: \"4640a2f1-1c73-4c41-b389-d956599766e4\") " Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.058310 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4640a2f1-1c73-4c41-b389-d956599766e4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4640a2f1-1c73-4c41-b389-d956599766e4" (UID: "4640a2f1-1c73-4c41-b389-d956599766e4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.079064 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4640a2f1-1c73-4c41-b389-d956599766e4-kube-api-access-49psd" (OuterVolumeSpecName: "kube-api-access-49psd") pod "4640a2f1-1c73-4c41-b389-d956599766e4" (UID: "4640a2f1-1c73-4c41-b389-d956599766e4"). InnerVolumeSpecName "kube-api-access-49psd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.159081 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8s975\" (UniqueName: \"kubernetes.io/projected/fdb2a273-6a0e-4572-bf19-509df8b07a5f-kube-api-access-8s975\") pod \"fdb2a273-6a0e-4572-bf19-509df8b07a5f\" (UID: \"fdb2a273-6a0e-4572-bf19-509df8b07a5f\") " Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.159740 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdb2a273-6a0e-4572-bf19-509df8b07a5f-operator-scripts\") pod \"fdb2a273-6a0e-4572-bf19-509df8b07a5f\" (UID: \"fdb2a273-6a0e-4572-bf19-509df8b07a5f\") " Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.160418 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49psd\" (UniqueName: \"kubernetes.io/projected/4640a2f1-1c73-4c41-b389-d956599766e4-kube-api-access-49psd\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.160582 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4640a2f1-1c73-4c41-b389-d956599766e4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.160524 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb2a273-6a0e-4572-bf19-509df8b07a5f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fdb2a273-6a0e-4572-bf19-509df8b07a5f" (UID: "fdb2a273-6a0e-4572-bf19-509df8b07a5f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.164008 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb2a273-6a0e-4572-bf19-509df8b07a5f-kube-api-access-8s975" (OuterVolumeSpecName: "kube-api-access-8s975") pod "fdb2a273-6a0e-4572-bf19-509df8b07a5f" (UID: "fdb2a273-6a0e-4572-bf19-509df8b07a5f"). InnerVolumeSpecName "kube-api-access-8s975". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.261831 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8s975\" (UniqueName: \"kubernetes.io/projected/fdb2a273-6a0e-4572-bf19-509df8b07a5f-kube-api-access-8s975\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.261870 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fdb2a273-6a0e-4572-bf19-509df8b07a5f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.410344 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:26 crc kubenswrapper[4691]: E1202 08:03:26.410587 4691 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 02 08:03:26 crc kubenswrapper[4691]: E1202 08:03:26.410627 4691 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 02 08:03:26 crc kubenswrapper[4691]: E1202 08:03:26.410940 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift podName:c25f8b81-a8e1-4035-ae92-209fd4ed5ec0 nodeName:}" failed. No retries permitted until 2025-12-02 08:03:42.410914497 +0000 UTC m=+1070.194993359 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift") pod "swift-storage-0" (UID: "c25f8b81-a8e1-4035-ae92-209fd4ed5ec0") : configmap "swift-ring-files" not found Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.548533 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-c2vdf" event={"ID":"4640a2f1-1c73-4c41-b389-d956599766e4","Type":"ContainerDied","Data":"253012aba6d1137df9e2f64bef348675d8fb7782685d014290c48a0259aa19f7"} Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.548582 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="253012aba6d1137df9e2f64bef348675d8fb7782685d014290c48a0259aa19f7" Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.548547 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-c2vdf" Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.549724 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-b036-account-create-update-z7rxp" event={"ID":"fdb2a273-6a0e-4572-bf19-509df8b07a5f","Type":"ContainerDied","Data":"d7cc8c89942fe86bbc7c078c61a8b3367b35def2739516bc883c40c0782942a3"} Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.549777 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7cc8c89942fe86bbc7c078c61a8b3367b35def2739516bc883c40c0782942a3" Dec 02 08:03:26 crc kubenswrapper[4691]: I1202 08:03:26.549783 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-b036-account-create-update-z7rxp" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.710950 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-nxcgj"] Dec 02 08:03:27 crc kubenswrapper[4691]: E1202 08:03:27.711956 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4821e81-a183-48bc-b558-9e1b64ad63c7" containerName="mariadb-account-create-update" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.711977 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4821e81-a183-48bc-b558-9e1b64ad63c7" containerName="mariadb-account-create-update" Dec 02 08:03:27 crc kubenswrapper[4691]: E1202 08:03:27.711991 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb2a273-6a0e-4572-bf19-509df8b07a5f" containerName="mariadb-account-create-update" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.711998 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb2a273-6a0e-4572-bf19-509df8b07a5f" containerName="mariadb-account-create-update" Dec 02 08:03:27 crc kubenswrapper[4691]: E1202 08:03:27.712014 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f033101-48ac-4bad-9b29-3547a7c6b721" containerName="mariadb-account-create-update" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.712022 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f033101-48ac-4bad-9b29-3547a7c6b721" containerName="mariadb-account-create-update" Dec 02 08:03:27 crc kubenswrapper[4691]: E1202 08:03:27.712038 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4640a2f1-1c73-4c41-b389-d956599766e4" containerName="mariadb-database-create" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.712044 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4640a2f1-1c73-4c41-b389-d956599766e4" containerName="mariadb-database-create" Dec 02 08:03:27 crc kubenswrapper[4691]: E1202 08:03:27.712054 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31da07a4-df34-4c7e-91a1-9847f83ed7fd" containerName="mariadb-database-create" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.712060 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="31da07a4-df34-4c7e-91a1-9847f83ed7fd" containerName="mariadb-database-create" Dec 02 08:03:27 crc kubenswrapper[4691]: E1202 08:03:27.712070 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db56f732-e81f-48e9-8b33-cc147b6ffc99" containerName="mariadb-database-create" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.712076 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="db56f732-e81f-48e9-8b33-cc147b6ffc99" containerName="mariadb-database-create" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.712255 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4640a2f1-1c73-4c41-b389-d956599766e4" containerName="mariadb-database-create" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.712269 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="31da07a4-df34-4c7e-91a1-9847f83ed7fd" containerName="mariadb-database-create" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.712276 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb2a273-6a0e-4572-bf19-509df8b07a5f" containerName="mariadb-account-create-update" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.712288 4691 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d4821e81-a183-48bc-b558-9e1b64ad63c7" containerName="mariadb-account-create-update" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.712297 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f033101-48ac-4bad-9b29-3547a7c6b721" containerName="mariadb-account-create-update" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.712308 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="db56f732-e81f-48e9-8b33-cc147b6ffc99" containerName="mariadb-database-create" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.713626 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.716193 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.716630 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5ngw7" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.725287 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-nxcgj"] Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.846725 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-combined-ca-bundle\") pod \"glance-db-sync-nxcgj\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.846799 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-config-data\") pod \"glance-db-sync-nxcgj\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.846847 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-db-sync-config-data\") pod \"glance-db-sync-nxcgj\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.846882 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pmkk\" (UniqueName: \"kubernetes.io/projected/4e433058-a34d-4156-9a25-07a573d1c4d2-kube-api-access-8pmkk\") pod \"glance-db-sync-nxcgj\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.949296 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-db-sync-config-data\") pod \"glance-db-sync-nxcgj\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.949374 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pmkk\" (UniqueName: \"kubernetes.io/projected/4e433058-a34d-4156-9a25-07a573d1c4d2-kube-api-access-8pmkk\") pod \"glance-db-sync-nxcgj\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " pod="openstack/glance-db-sync-nxcgj" Dec 02 
08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.949527 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-combined-ca-bundle\") pod \"glance-db-sync-nxcgj\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.949546 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-config-data\") pod \"glance-db-sync-nxcgj\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.954616 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-db-sync-config-data\") pod \"glance-db-sync-nxcgj\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.955043 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-config-data\") pod \"glance-db-sync-nxcgj\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.955841 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-combined-ca-bundle\") pod \"glance-db-sync-nxcgj\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:27 crc kubenswrapper[4691]: I1202 08:03:27.969890 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pmkk\" (UniqueName: \"kubernetes.io/projected/4e433058-a34d-4156-9a25-07a573d1c4d2-kube-api-access-8pmkk\") pod \"glance-db-sync-nxcgj\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.060608 4691 util.go:30] "No sandbox for pod can be found. 
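
For the glance-db-sync-nxcgj pod, each volume walks the same three phases in order: `VerifyControllerAttachedVolume started`, then `MountVolume started`, then `MountVolume.SetUp succeeded`. A small sketch that checks this ordering per volume name from journal text (regexes illustrative; sample abridged from the entries above):

```python
import re

# Per-volume phase order observed above: attach check -> mount started -> setup done.
PHASES = ["VerifyControllerAttachedVolume started",
          "MountVolume started",
          "MountVolume.SetUp succeeded"]

def check_order(lines):
    """Track the last phase index seen per volume name; flag any out-of-order step."""
    last = {}
    for raw in lines:
        line = raw.replace('\\"', '"')          # undo klog's escaped quotes
        for idx, phase in enumerate(PHASES):
            m = re.search(re.escape(phase) + r' for volume "([^"]+)"', line)
            if m:
                vol = m.group(1)
                if idx < last.get(vol, -1):
                    print(f"out of order for {vol}: {phase}")
                last[vol] = idx
    return last

sample = [
    'operationExecutor.VerifyControllerAttachedVolume started for volume \\"config-data\\" ...',
    'operationExecutor.MountVolume started for volume \\"config-data\\" ...',
    'MountVolume.SetUp succeeded for volume \\"config-data\\" ...',
]
print(check_order(sample))  # {'config-data': 2}
```
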
Need to start a new one" pod="openstack/glance-db-sync-nxcgj" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.397644 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-9jwww" podUID="ed3b8ff0-c19d-4614-abe4-0ad6b5801b78" containerName="ovn-controller" probeResult="failure" output=< Dec 02 08:03:28 crc kubenswrapper[4691]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 02 08:03:28 crc kubenswrapper[4691]: > Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.407097 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.419604 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-twpkf" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.565132 4691 generic.go:334] "Generic (PLEG): container finished" podID="8ce7acb7-a140-4c78-a71d-d3c96aa12651" containerID="03bad78ac06c30a1d387147bb88237a08418b8554054a53d0bc4d89c9bff2ae8" exitCode=0 Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.572479 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8ce7acb7-a140-4c78-a71d-d3c96aa12651","Type":"ContainerDied","Data":"03bad78ac06c30a1d387147bb88237a08418b8554054a53d0bc4d89c9bff2ae8"} Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.652205 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-9jwww-config-wkpsn"] Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.655836 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.660826 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.685920 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9jwww-config-wkpsn"] Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.697939 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-run\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.698295 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-log-ovn\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.698548 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fa18c911-9d53-494c-a6b8-0633112dfabf-additional-scripts\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.698612 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-run-ovn\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.698701 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa18c911-9d53-494c-a6b8-0633112dfabf-scripts\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.698822 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xhdr\" (UniqueName: \"kubernetes.io/projected/fa18c911-9d53-494c-a6b8-0633112dfabf-kube-api-access-4xhdr\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.755812 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-nxcgj"] Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.800579 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-run\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.800704 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-log-ovn\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.800780 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fa18c911-9d53-494c-a6b8-0633112dfabf-additional-scripts\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.800806 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-run-ovn\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.800842 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa18c911-9d53-494c-a6b8-0633112dfabf-scripts\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.800890 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xhdr\" (UniqueName: \"kubernetes.io/projected/fa18c911-9d53-494c-a6b8-0633112dfabf-kube-api-access-4xhdr\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " 
pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.801213 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-log-ovn\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.801259 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-run\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.801315 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-run-ovn\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.801752 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fa18c911-9d53-494c-a6b8-0633112dfabf-additional-scripts\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.803175 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa18c911-9d53-494c-a6b8-0633112dfabf-scripts\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.826355 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xhdr\" (UniqueName: \"kubernetes.io/projected/fa18c911-9d53-494c-a6b8-0633112dfabf-kube-api-access-4xhdr\") pod \"ovn-controller-9jwww-config-wkpsn\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:28 crc kubenswrapper[4691]: I1202 08:03:28.989537 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:29 crc kubenswrapper[4691]: W1202 08:03:29.426621 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa18c911_9d53_494c_a6b8_0633112dfabf.slice/crio-ebb2c7753efd304f270743f654d56e27ba90527cff69691c79dd3864980ae9d5 WatchSource:0}: Error finding container ebb2c7753efd304f270743f654d56e27ba90527cff69691c79dd3864980ae9d5: Status 404 returned error can't find the container with id ebb2c7753efd304f270743f654d56e27ba90527cff69691c79dd3864980ae9d5 Dec 02 08:03:29 crc kubenswrapper[4691]: I1202 08:03:29.438467 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-9jwww-config-wkpsn"] Dec 02 08:03:29 crc kubenswrapper[4691]: I1202 08:03:29.633027 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8ce7acb7-a140-4c78-a71d-d3c96aa12651","Type":"ContainerStarted","Data":"14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899"} Dec 02 08:03:29 crc kubenswrapper[4691]: I1202 08:03:29.633213 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:03:29 crc kubenswrapper[4691]: I1202 08:03:29.634317 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nxcgj" event={"ID":"4e433058-a34d-4156-9a25-07a573d1c4d2","Type":"ContainerStarted","Data":"6e679b51c118e6dee47f03cea7ed159e5dd4044bc688c1c59577e13753a8ede1"} Dec 02 08:03:29 crc kubenswrapper[4691]: I1202 08:03:29.635407 4691 generic.go:334] "Generic (PLEG): container finished" podID="82672deb-2527-4d18-8006-0f794dfe97c0" containerID="5013954bb97662a2d2d516f8acd58da492f3a45199c5e068ad0bb24d9e78cdbb" exitCode=0 Dec 02 08:03:29 crc kubenswrapper[4691]: I1202 08:03:29.635443 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lchdp" event={"ID":"82672deb-2527-4d18-8006-0f794dfe97c0","Type":"ContainerDied","Data":"5013954bb97662a2d2d516f8acd58da492f3a45199c5e068ad0bb24d9e78cdbb"} Dec 02 08:03:29 crc kubenswrapper[4691]: I1202 08:03:29.636724 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jwww-config-wkpsn" event={"ID":"fa18c911-9d53-494c-a6b8-0633112dfabf","Type":"ContainerStarted","Data":"ebb2c7753efd304f270743f654d56e27ba90527cff69691c79dd3864980ae9d5"} Dec 02 08:03:29 crc kubenswrapper[4691]: I1202 08:03:29.665940 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=59.650092156 podStartE2EDuration="1m7.66592134s" podCreationTimestamp="2025-12-02 08:02:22 +0000 UTC" firstStartedPulling="2025-12-02 08:02:44.357969381 +0000 UTC m=+1012.142048243" lastFinishedPulling="2025-12-02 08:02:52.373798565 +0000 UTC m=+1020.157877427" observedRunningTime="2025-12-02 08:03:29.655538479 +0000 UTC m=+1057.439617341" watchObservedRunningTime="2025-12-02 08:03:29.66592134 +0000 UTC m=+1057.450000202" Dec 02 08:03:30 crc kubenswrapper[4691]: I1202 08:03:30.748157 4691 generic.go:334] "Generic (PLEG): container finished" podID="fa18c911-9d53-494c-a6b8-0633112dfabf" containerID="e229829d9ac58b537f184ac5848a971c75f0301410a22ce13a9ecbf22402bba2" exitCode=0 Dec 02 08:03:30 crc kubenswrapper[4691]: I1202 08:03:30.749753 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jwww-config-wkpsn" 
event={"ID":"fa18c911-9d53-494c-a6b8-0633112dfabf","Type":"ContainerDied","Data":"e229829d9ac58b537f184ac5848a971c75f0301410a22ce13a9ecbf22402bba2"} Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.138929 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.171721 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82672deb-2527-4d18-8006-0f794dfe97c0-scripts\") pod \"82672deb-2527-4d18-8006-0f794dfe97c0\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.171790 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/82672deb-2527-4d18-8006-0f794dfe97c0-ring-data-devices\") pod \"82672deb-2527-4d18-8006-0f794dfe97c0\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.171851 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-dispersionconf\") pod \"82672deb-2527-4d18-8006-0f794dfe97c0\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.171886 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-swiftconf\") pod \"82672deb-2527-4d18-8006-0f794dfe97c0\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.171960 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-278xl\" (UniqueName: \"kubernetes.io/projected/82672deb-2527-4d18-8006-0f794dfe97c0-kube-api-access-278xl\") pod \"82672deb-2527-4d18-8006-0f794dfe97c0\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.172083 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-combined-ca-bundle\") pod \"82672deb-2527-4d18-8006-0f794dfe97c0\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.172782 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/82672deb-2527-4d18-8006-0f794dfe97c0-etc-swift\") pod \"82672deb-2527-4d18-8006-0f794dfe97c0\" (UID: \"82672deb-2527-4d18-8006-0f794dfe97c0\") " Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.172856 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82672deb-2527-4d18-8006-0f794dfe97c0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "82672deb-2527-4d18-8006-0f794dfe97c0" (UID: "82672deb-2527-4d18-8006-0f794dfe97c0"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.174111 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82672deb-2527-4d18-8006-0f794dfe97c0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "82672deb-2527-4d18-8006-0f794dfe97c0" (UID: "82672deb-2527-4d18-8006-0f794dfe97c0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.174776 4691 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/82672deb-2527-4d18-8006-0f794dfe97c0-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.174803 4691 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/82672deb-2527-4d18-8006-0f794dfe97c0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.178801 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82672deb-2527-4d18-8006-0f794dfe97c0-kube-api-access-278xl" (OuterVolumeSpecName: "kube-api-access-278xl") pod "82672deb-2527-4d18-8006-0f794dfe97c0" (UID: "82672deb-2527-4d18-8006-0f794dfe97c0"). InnerVolumeSpecName "kube-api-access-278xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.182412 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "82672deb-2527-4d18-8006-0f794dfe97c0" (UID: "82672deb-2527-4d18-8006-0f794dfe97c0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.197704 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82672deb-2527-4d18-8006-0f794dfe97c0-scripts" (OuterVolumeSpecName: "scripts") pod "82672deb-2527-4d18-8006-0f794dfe97c0" (UID: "82672deb-2527-4d18-8006-0f794dfe97c0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.200812 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "82672deb-2527-4d18-8006-0f794dfe97c0" (UID: "82672deb-2527-4d18-8006-0f794dfe97c0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.211042 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82672deb-2527-4d18-8006-0f794dfe97c0" (UID: "82672deb-2527-4d18-8006-0f794dfe97c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.276307 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/82672deb-2527-4d18-8006-0f794dfe97c0-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.276341 4691 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.276352 4691 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.276362 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-278xl\" (UniqueName: \"kubernetes.io/projected/82672deb-2527-4d18-8006-0f794dfe97c0-kube-api-access-278xl\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.276372 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82672deb-2527-4d18-8006-0f794dfe97c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.966161 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-lchdp" event={"ID":"82672deb-2527-4d18-8006-0f794dfe97c0","Type":"ContainerDied","Data":"e6a945c41f18e1c30c623d6a77e9c5cedc22e4037b22f5db1553811c1c580915"} Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.966214 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6a945c41f18e1c30c623d6a77e9c5cedc22e4037b22f5db1553811c1c580915" Dec 02 08:03:31 crc kubenswrapper[4691]: I1202 08:03:31.966223 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-lchdp" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.601485 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.767340 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-run-ovn\") pod \"fa18c911-9d53-494c-a6b8-0633112dfabf\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.767749 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fa18c911-9d53-494c-a6b8-0633112dfabf-additional-scripts\") pod \"fa18c911-9d53-494c-a6b8-0633112dfabf\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.767574 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "fa18c911-9d53-494c-a6b8-0633112dfabf" (UID: "fa18c911-9d53-494c-a6b8-0633112dfabf"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.767812 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-log-ovn\") pod \"fa18c911-9d53-494c-a6b8-0633112dfabf\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.767871 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa18c911-9d53-494c-a6b8-0633112dfabf-scripts\") pod \"fa18c911-9d53-494c-a6b8-0633112dfabf\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.767928 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xhdr\" (UniqueName: \"kubernetes.io/projected/fa18c911-9d53-494c-a6b8-0633112dfabf-kube-api-access-4xhdr\") pod \"fa18c911-9d53-494c-a6b8-0633112dfabf\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.767943 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "fa18c911-9d53-494c-a6b8-0633112dfabf" (UID: "fa18c911-9d53-494c-a6b8-0633112dfabf"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.767966 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-run\") pod \"fa18c911-9d53-494c-a6b8-0633112dfabf\" (UID: \"fa18c911-9d53-494c-a6b8-0633112dfabf\") " Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.768300 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-run" (OuterVolumeSpecName: "var-run") pod "fa18c911-9d53-494c-a6b8-0633112dfabf" (UID: "fa18c911-9d53-494c-a6b8-0633112dfabf"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.768727 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa18c911-9d53-494c-a6b8-0633112dfabf-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "fa18c911-9d53-494c-a6b8-0633112dfabf" (UID: "fa18c911-9d53-494c-a6b8-0633112dfabf"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.768992 4691 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-run\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.769015 4691 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.769029 4691 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/fa18c911-9d53-494c-a6b8-0633112dfabf-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.769043 4691 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/fa18c911-9d53-494c-a6b8-0633112dfabf-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.770429 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa18c911-9d53-494c-a6b8-0633112dfabf-scripts" (OuterVolumeSpecName: "scripts") pod "fa18c911-9d53-494c-a6b8-0633112dfabf" (UID: "fa18c911-9d53-494c-a6b8-0633112dfabf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.790061 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa18c911-9d53-494c-a6b8-0633112dfabf-kube-api-access-4xhdr" (OuterVolumeSpecName: "kube-api-access-4xhdr") pod "fa18c911-9d53-494c-a6b8-0633112dfabf" (UID: "fa18c911-9d53-494c-a6b8-0633112dfabf"). InnerVolumeSpecName "kube-api-access-4xhdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.870735 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa18c911-9d53-494c-a6b8-0633112dfabf-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.870794 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xhdr\" (UniqueName: \"kubernetes.io/projected/fa18c911-9d53-494c-a6b8-0633112dfabf-kube-api-access-4xhdr\") on node \"crc\" DevicePath \"\"" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.978464 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-9jwww-config-wkpsn" event={"ID":"fa18c911-9d53-494c-a6b8-0633112dfabf","Type":"ContainerDied","Data":"ebb2c7753efd304f270743f654d56e27ba90527cff69691c79dd3864980ae9d5"} Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.978506 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ebb2c7753efd304f270743f654d56e27ba90527cff69691c79dd3864980ae9d5" Dec 02 08:03:32 crc kubenswrapper[4691]: I1202 08:03:32.978571 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-9jwww-config-wkpsn" Dec 02 08:03:33 crc kubenswrapper[4691]: I1202 08:03:33.312504 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-9jwww" Dec 02 08:03:33 crc kubenswrapper[4691]: I1202 08:03:33.720596 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-9jwww-config-wkpsn"] Dec 02 08:03:33 crc kubenswrapper[4691]: I1202 08:03:33.730487 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-9jwww-config-wkpsn"] Dec 02 08:03:34 crc kubenswrapper[4691]: I1202 08:03:34.577819 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa18c911-9d53-494c-a6b8-0633112dfabf" path="/var/lib/kubelet/pods/fa18c911-9d53-494c-a6b8-0633112dfabf/volumes" Dec 02 08:03:38 crc kubenswrapper[4691]: I1202 08:03:38.039940 4691 generic.go:334] "Generic (PLEG): container finished" podID="1ed4ad29-5963-47aa-ba01-faf16686c61d" containerID="f96f68dca2c2f22fe5a28cb0a5c8accb88b1b3a79f8ea6b787c57a7a5579d925" exitCode=0 Dec 02 08:03:38 crc kubenswrapper[4691]: I1202 08:03:38.040068 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1ed4ad29-5963-47aa-ba01-faf16686c61d","Type":"ContainerDied","Data":"f96f68dca2c2f22fe5a28cb0a5c8accb88b1b3a79f8ea6b787c57a7a5579d925"} Dec 02 08:03:42 crc kubenswrapper[4691]: I1202 08:03:42.502924 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:42 crc kubenswrapper[4691]: I1202 08:03:42.510169 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c25f8b81-a8e1-4035-ae92-209fd4ed5ec0-etc-swift\") pod \"swift-storage-0\" (UID: \"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0\") " pod="openstack/swift-storage-0" Dec 02 08:03:42 crc kubenswrapper[4691]: I1202 08:03:42.789135 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 02 08:03:44 crc kubenswrapper[4691]: I1202 08:03:44.129037 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:03:49 crc kubenswrapper[4691]: E1202 08:03:49.504229 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Dec 02 08:03:49 crc kubenswrapper[4691]: E1202 08:03:49.505375 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8pmkk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-nxcgj_openstack(4e433058-a34d-4156-9a25-07a573d1c4d2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:03:49 crc kubenswrapper[4691]: E1202 08:03:49.506960 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-nxcgj" podUID="4e433058-a34d-4156-9a25-07a573d1c4d2" Dec 02 08:03:50 crc kubenswrapper[4691]: I1202 08:03:50.160887 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 02 08:03:50 crc kubenswrapper[4691]: W1202 08:03:50.169532 4691 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc25f8b81_a8e1_4035_ae92_209fd4ed5ec0.slice/crio-7c9b49b85b034f85d09f2f9c88d5ddcf564a060d7a79ac54a40ea0efe8a7d30a WatchSource:0}: Error finding container 7c9b49b85b034f85d09f2f9c88d5ddcf564a060d7a79ac54a40ea0efe8a7d30a: Status 404 returned error can't find the container with id 7c9b49b85b034f85d09f2f9c88d5ddcf564a060d7a79ac54a40ea0efe8a7d30a Dec 02 08:03:50 crc kubenswrapper[4691]: I1202 08:03:50.194862 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"7c9b49b85b034f85d09f2f9c88d5ddcf564a060d7a79ac54a40ea0efe8a7d30a"} Dec 02 08:03:50 crc kubenswrapper[4691]: I1202 08:03:50.197818 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1ed4ad29-5963-47aa-ba01-faf16686c61d","Type":"ContainerStarted","Data":"a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1"} Dec 02 08:03:50 crc kubenswrapper[4691]: I1202 08:03:50.198358 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 08:03:50 crc kubenswrapper[4691]: E1202 08:03:50.199273 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-nxcgj" podUID="4e433058-a34d-4156-9a25-07a573d1c4d2" Dec 02 08:03:50 crc kubenswrapper[4691]: I1202 08:03:50.257033 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=79.317256808 podStartE2EDuration="1m28.257013306s" podCreationTimestamp="2025-12-02 08:02:22 +0000 UTC" firstStartedPulling="2025-12-02 08:02:44.83898824 +0000 UTC m=+1012.623067102" lastFinishedPulling="2025-12-02 08:02:53.778744738 +0000 UTC m=+1021.562823600" observedRunningTime="2025-12-02 08:03:50.251308812 +0000 UTC m=+1078.035387684" watchObservedRunningTime="2025-12-02 08:03:50.257013306 +0000 UTC m=+1078.041092168" Dec 02 08:03:52 crc kubenswrapper[4691]: I1202 08:03:52.218819 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"ba43d22bf0824532acc6e6a967e93cb176994036cdf7d94b22ece1358c445510"} Dec 02 08:03:52 crc kubenswrapper[4691]: I1202 08:03:52.219417 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"2601d079620e7aab22fad3907e5e77f26a954b28183a5b64d46d9c630a7da39e"} Dec 02 08:03:53 crc kubenswrapper[4691]: I1202 08:03:53.238578 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"bb151a914b54e5d533effb91b5af3d5bf2f1caf3a6eb0571a8beb55b21f6105e"} Dec 02 08:03:53 crc kubenswrapper[4691]: I1202 08:03:53.239148 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"4b869fa6ad1b0cd915b18b1aee9a602043aa5bf96edb55c16afc96fe7f53075c"} Dec 02 08:03:55 crc kubenswrapper[4691]: I1202 08:03:55.257080 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
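
The glance-db-sync pull above fails with `ErrImagePull` (the copy was context-canceled) and the next sync loop reports `ImagePullBackOff`, i.e. the kubelet will not re-attempt the pull until a back-off window expires. The constants in this sketch are the commonly cited kubelet defaults (10s initial, doubling, capped at 5 minutes) and are assumptions, not values read from this node's config:

```python
# Hedged sketch of the image-pull back-off progression behind "ImagePullBackOff".
def pull_backoff(initial=10, factor=2, cap=300):
    """Yield successive back-off delays in seconds for repeated pull failures."""
    delay = initial
    while True:
        yield min(delay, cap)
        delay *= factor

gen = pull_backoff()
print([next(gen) for _ in range(7)])  # [10, 20, 40, 80, 160, 300, 300]
```
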
event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"032ec04edc63f3246a4edbffe33f4c76a34185726ac65cd8ca7c2d10f0d472ab"} Dec 02 08:03:55 crc kubenswrapper[4691]: I1202 08:03:55.257408 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"49b8004756da176cedd204c62c5183f13967cee1fa6646e837c7b4b83faf5885"} Dec 02 08:03:55 crc kubenswrapper[4691]: I1202 08:03:55.257422 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"d5b0f675574771974ccf3a93d7b0def29b89f992e6d64b78146594be76aaf368"} Dec 02 08:03:56 crc kubenswrapper[4691]: I1202 08:03:56.276491 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"04eccfcc6ef8a468150083c996e41672cf2070b9c5cc326f2be5ba7d6e7969e3"} Dec 02 08:03:56 crc kubenswrapper[4691]: I1202 08:03:56.993499 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="aa4f9395-a46a-40e4-a80c-c9b43caadc0b" containerName="galera" probeResult="failure" output="command timed out" Dec 02 08:03:59 crc kubenswrapper[4691]: I1202 08:03:59.347513 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"7af3589376033d48bb041d7d6243d42bb412fe98ef675afe53919f12c95fc5e8"} Dec 02 08:03:59 crc kubenswrapper[4691]: I1202 08:03:59.348332 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"b78733a73894860b0981d7f12b489a52c1dbc44088785012cb5637da77647c2e"} Dec 02 08:03:59 crc kubenswrapper[4691]: I1202 08:03:59.348364 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"ca06417476c7283dfc902e8c67eed10db9da3fc76a5f89e9c5610a7cf5d9270a"} Dec 02 08:04:00 crc kubenswrapper[4691]: I1202 08:04:00.518678 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"57eac6a6f995a73b9a758e05e23af3cb9f6c6e616e3c9b0928b11a6ff144d611"} Dec 02 08:04:00 crc kubenswrapper[4691]: I1202 08:04:00.519007 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"a317901cce54ecf33271beb8988ff33f8e925bad3b788b471b42313a22b8090f"} Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.534530 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"7cf9a21d74a92d07131b128d630407a838acac07f4703454307ede1da4a12c58"} Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.534862 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c25f8b81-a8e1-4035-ae92-209fd4ed5ec0","Type":"ContainerStarted","Data":"52e5324cd19ae84c3feef705fb98d7793a28fd011918ff105ecd075a9da82eb1"} Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.573592 4691 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=44.563850491 podStartE2EDuration="52.573566383s" podCreationTimestamp="2025-12-02 08:03:09 +0000 UTC" firstStartedPulling="2025-12-02 08:03:50.172200562 +0000 UTC m=+1077.956279424" lastFinishedPulling="2025-12-02 08:03:58.181916454 +0000 UTC m=+1085.965995316" observedRunningTime="2025-12-02 08:04:01.569653485 +0000 UTC m=+1089.353732377" watchObservedRunningTime="2025-12-02 08:04:01.573566383 +0000 UTC m=+1089.357645245" Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.852128 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-29vqc"] Dec 02 08:04:01 crc kubenswrapper[4691]: E1202 08:04:01.852576 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa18c911-9d53-494c-a6b8-0633112dfabf" containerName="ovn-config" Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.852598 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa18c911-9d53-494c-a6b8-0633112dfabf" containerName="ovn-config" Dec 02 08:04:01 crc kubenswrapper[4691]: E1202 08:04:01.852617 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82672deb-2527-4d18-8006-0f794dfe97c0" containerName="swift-ring-rebalance" Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.852626 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="82672deb-2527-4d18-8006-0f794dfe97c0" containerName="swift-ring-rebalance" Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.852873 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="82672deb-2527-4d18-8006-0f794dfe97c0" containerName="swift-ring-rebalance" Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.852931 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa18c911-9d53-494c-a6b8-0633112dfabf" containerName="ovn-config" Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.854024 4691 util.go:30] "No sandbox for pod can be found. 
Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.864650 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.874469 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-29vqc"]
Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.974148 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-config\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.974252 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5qnb\" (UniqueName: \"kubernetes.io/projected/7fc34093-aa4d-4e35-992b-c071d31705d5-kube-api-access-c5qnb\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.974288 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.974324 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.974344 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-dns-svc\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:01 crc kubenswrapper[4691]: I1202 08:04:01.974365 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.076163 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-config\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.076512 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5qnb\" (UniqueName: \"kubernetes.io/projected/7fc34093-aa4d-4e35-992b-c071d31705d5-kube-api-access-c5qnb\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.076644 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.076780 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.076901 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-dns-svc\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.077021 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.077644 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.077741 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.077863 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-dns-svc\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.078174 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.078522 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-config\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.099826 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5qnb\" (UniqueName: \"kubernetes.io/projected/7fc34093-aa4d-4e35-992b-c071d31705d5-kube-api-access-c5qnb\") pod \"dnsmasq-dns-764c5664d7-29vqc\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.173081 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:02 crc kubenswrapper[4691]: I1202 08:04:02.639613 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-29vqc"]
Dec 02 08:04:02 crc kubenswrapper[4691]: W1202 08:04:02.644086 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7fc34093_aa4d_4e35_992b_c071d31705d5.slice/crio-a02e4dfb5e36f1b0af7065957e3ee7bdec2f7e49b4656239541f8eb8eac7ef21 WatchSource:0}: Error finding container a02e4dfb5e36f1b0af7065957e3ee7bdec2f7e49b4656239541f8eb8eac7ef21: Status 404 returned error can't find the container with id a02e4dfb5e36f1b0af7065957e3ee7bdec2f7e49b4656239541f8eb8eac7ef21
Dec 02 08:04:03 crc kubenswrapper[4691]: I1202 08:04:03.560714 4691 generic.go:334] "Generic (PLEG): container finished" podID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerID="e524a5b59a0465f5b3325213907faa9317da3ac7ddb909d796c2aaf09952f508" exitCode=0
Dec 02 08:04:03 crc kubenswrapper[4691]: I1202 08:04:03.560835 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" event={"ID":"7fc34093-aa4d-4e35-992b-c071d31705d5","Type":"ContainerDied","Data":"e524a5b59a0465f5b3325213907faa9317da3ac7ddb909d796c2aaf09952f508"}
Dec 02 08:04:03 crc kubenswrapper[4691]: I1202 08:04:03.561631 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" event={"ID":"7fc34093-aa4d-4e35-992b-c071d31705d5","Type":"ContainerStarted","Data":"a02e4dfb5e36f1b0af7065957e3ee7bdec2f7e49b4656239541f8eb8eac7ef21"}
Dec 02 08:04:03 crc kubenswrapper[4691]: I1202 08:04:03.820928 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.301595 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-hqmqz"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.303029 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hqmqz"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.330904 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-hqmqz"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.397538 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-f40f-account-create-update-9lqsp"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.398850 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f40f-account-create-update-9lqsp"
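The W-level "Failed to process watch event ... Status 404" entries above are cAdvisor noticing a new crio-<id> cgroup before the runtime has the container registered; in this window each fires once per new container and the ContainerStarted event follows shortly after. A sketch that tallies them per container ID, on the assumption that a persistently recurring ID (rather than this one-shot startup race) is what would actually merit investigation:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches the cadvisor watch-event warning as it appears in this journal.
var watch404 = regexp.MustCompile(`Error finding container ([0-9a-f]{64}): Status 404`)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024)
	for sc.Scan() {
		if m := watch404.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[1]]++
		}
	}
	for id, n := range counts {
		fmt.Printf("%.12s seen %d time(s)\n", id, n)
	}
}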
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.401539 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.408807 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f40f-account-create-update-9lqsp"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.418213 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq2z4\" (UniqueName: \"kubernetes.io/projected/ad884439-9cac-4888-902a-81c30359a9b9-kube-api-access-wq2z4\") pod \"barbican-db-create-hqmqz\" (UID: \"ad884439-9cac-4888-902a-81c30359a9b9\") " pod="openstack/barbican-db-create-hqmqz"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.418340 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad884439-9cac-4888-902a-81c30359a9b9-operator-scripts\") pod \"barbican-db-create-hqmqz\" (UID: \"ad884439-9cac-4888-902a-81c30359a9b9\") " pod="openstack/barbican-db-create-hqmqz"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.519695 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62ec2865-a83a-43c0-a786-e5a22b7a7008-operator-scripts\") pod \"barbican-f40f-account-create-update-9lqsp\" (UID: \"62ec2865-a83a-43c0-a786-e5a22b7a7008\") " pod="openstack/barbican-f40f-account-create-update-9lqsp"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.519781 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad884439-9cac-4888-902a-81c30359a9b9-operator-scripts\") pod \"barbican-db-create-hqmqz\" (UID: \"ad884439-9cac-4888-902a-81c30359a9b9\") " pod="openstack/barbican-db-create-hqmqz"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.520208 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5ww7\" (UniqueName: \"kubernetes.io/projected/62ec2865-a83a-43c0-a786-e5a22b7a7008-kube-api-access-d5ww7\") pod \"barbican-f40f-account-create-update-9lqsp\" (UID: \"62ec2865-a83a-43c0-a786-e5a22b7a7008\") " pod="openstack/barbican-f40f-account-create-update-9lqsp"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.520343 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq2z4\" (UniqueName: \"kubernetes.io/projected/ad884439-9cac-4888-902a-81c30359a9b9-kube-api-access-wq2z4\") pod \"barbican-db-create-hqmqz\" (UID: \"ad884439-9cac-4888-902a-81c30359a9b9\") " pod="openstack/barbican-db-create-hqmqz"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.520599 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad884439-9cac-4888-902a-81c30359a9b9-operator-scripts\") pod \"barbican-db-create-hqmqz\" (UID: \"ad884439-9cac-4888-902a-81c30359a9b9\") " pod="openstack/barbican-db-create-hqmqz"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.543839 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq2z4\" (UniqueName: \"kubernetes.io/projected/ad884439-9cac-4888-902a-81c30359a9b9-kube-api-access-wq2z4\") pod \"barbican-db-create-hqmqz\" (UID: \"ad884439-9cac-4888-902a-81c30359a9b9\") " pod="openstack/barbican-db-create-hqmqz"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.584337 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-29vqc"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.584426 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" event={"ID":"7fc34093-aa4d-4e35-992b-c071d31705d5","Type":"ContainerStarted","Data":"6e1fec54f20d4f1739f4c040799a33a98a9ef900f8d37ccc9d4b9737485f74e7"}
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.622487 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5ww7\" (UniqueName: \"kubernetes.io/projected/62ec2865-a83a-43c0-a786-e5a22b7a7008-kube-api-access-d5ww7\") pod \"barbican-f40f-account-create-update-9lqsp\" (UID: \"62ec2865-a83a-43c0-a786-e5a22b7a7008\") " pod="openstack/barbican-f40f-account-create-update-9lqsp"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.622683 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62ec2865-a83a-43c0-a786-e5a22b7a7008-operator-scripts\") pod \"barbican-f40f-account-create-update-9lqsp\" (UID: \"62ec2865-a83a-43c0-a786-e5a22b7a7008\") " pod="openstack/barbican-f40f-account-create-update-9lqsp"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.623777 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62ec2865-a83a-43c0-a786-e5a22b7a7008-operator-scripts\") pod \"barbican-f40f-account-create-update-9lqsp\" (UID: \"62ec2865-a83a-43c0-a786-e5a22b7a7008\") " pod="openstack/barbican-f40f-account-create-update-9lqsp"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.625512 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hqmqz"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.644829 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-lwfnn"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.646326 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lwfnn"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.653348 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lwfnn"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.655657 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5ww7\" (UniqueName: \"kubernetes.io/projected/62ec2865-a83a-43c0-a786-e5a22b7a7008-kube-api-access-d5ww7\") pod \"barbican-f40f-account-create-update-9lqsp\" (UID: \"62ec2865-a83a-43c0-a786-e5a22b7a7008\") " pod="openstack/barbican-f40f-account-create-update-9lqsp"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.686961 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" podStartSLOduration=3.686928853 podStartE2EDuration="3.686928853s" podCreationTimestamp="2025-12-02 08:04:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:04:04.676958892 +0000 UTC m=+1092.461037764" watchObservedRunningTime="2025-12-02 08:04:04.686928853 +0000 UTC m=+1092.471007715"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.727973 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97cw9\" (UniqueName: \"kubernetes.io/projected/92913a64-59ad-47ca-ad56-a2ad01fbc281-kube-api-access-97cw9\") pod \"cinder-db-create-lwfnn\" (UID: \"92913a64-59ad-47ca-ad56-a2ad01fbc281\") " pod="openstack/cinder-db-create-lwfnn"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.728102 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92913a64-59ad-47ca-ad56-a2ad01fbc281-operator-scripts\") pod \"cinder-db-create-lwfnn\" (UID: \"92913a64-59ad-47ca-ad56-a2ad01fbc281\") " pod="openstack/cinder-db-create-lwfnn"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.743845 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-c46db"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.744990 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-c46db"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.766403 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b89c-account-create-update-5qdcl"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.768156 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b89c-account-create-update-5qdcl"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.770897 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.775908 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-c46db"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.790625 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f40f-account-create-update-9lqsp"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.813942 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b89c-account-create-update-5qdcl"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.829883 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92913a64-59ad-47ca-ad56-a2ad01fbc281-operator-scripts\") pod \"cinder-db-create-lwfnn\" (UID: \"92913a64-59ad-47ca-ad56-a2ad01fbc281\") " pod="openstack/cinder-db-create-lwfnn"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.829959 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjcs8\" (UniqueName: \"kubernetes.io/projected/7425c462-167a-4f21-9eee-afb9b7b1767e-kube-api-access-pjcs8\") pod \"cinder-b89c-account-create-update-5qdcl\" (UID: \"7425c462-167a-4f21-9eee-afb9b7b1767e\") " pod="openstack/cinder-b89c-account-create-update-5qdcl"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.829983 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7425c462-167a-4f21-9eee-afb9b7b1767e-operator-scripts\") pod \"cinder-b89c-account-create-update-5qdcl\" (UID: \"7425c462-167a-4f21-9eee-afb9b7b1767e\") " pod="openstack/cinder-b89c-account-create-update-5qdcl"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.830028 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrdgx\" (UniqueName: \"kubernetes.io/projected/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d-kube-api-access-xrdgx\") pod \"neutron-db-create-c46db\" (UID: \"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d\") " pod="openstack/neutron-db-create-c46db"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.830049 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d-operator-scripts\") pod \"neutron-db-create-c46db\" (UID: \"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d\") " pod="openstack/neutron-db-create-c46db"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.830099 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97cw9\" (UniqueName: \"kubernetes.io/projected/92913a64-59ad-47ca-ad56-a2ad01fbc281-kube-api-access-97cw9\") pod \"cinder-db-create-lwfnn\" (UID: \"92913a64-59ad-47ca-ad56-a2ad01fbc281\") " pod="openstack/cinder-db-create-lwfnn"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.831106 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92913a64-59ad-47ca-ad56-a2ad01fbc281-operator-scripts\") pod \"cinder-db-create-lwfnn\" (UID: \"92913a64-59ad-47ca-ad56-a2ad01fbc281\") " pod="openstack/cinder-db-create-lwfnn"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.857399 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97cw9\" (UniqueName: \"kubernetes.io/projected/92913a64-59ad-47ca-ad56-a2ad01fbc281-kube-api-access-97cw9\") pod \"cinder-db-create-lwfnn\" (UID: \"92913a64-59ad-47ca-ad56-a2ad01fbc281\") " pod="openstack/cinder-db-create-lwfnn"
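Each volume above moves through three logged stages: VerifyControllerAttachedVolume, MountVolume started, and MountVolume.SetUp succeeded. A sketch that correlates the stages by UniqueName and flags any volume that never reaches SetUp; the regexes encode only the phrasing (including the literal escaped quotes) seen in this journal:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// The three stages as phrased in these entries; the UniqueName is the key.
var stages = map[string]*regexp.Regexp{
	"verify": regexp.MustCompile(`VerifyControllerAttachedVolume started for volume .*\(UniqueName: \\"([^"\\]+)\\"`),
	"mount":  regexp.MustCompile(`"operationExecutor\.MountVolume started for volume .*\(UniqueName: \\"([^"\\]+)\\"`),
	"setup":  regexp.MustCompile(`MountVolume\.SetUp succeeded for volume .*\(UniqueName: \\"([^"\\]+)\\"`),
}

func main() {
	seen := map[string]map[string]bool{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024)
	for sc.Scan() {
		for stage, re := range stages {
			if m := re.FindStringSubmatch(sc.Text()); m != nil {
				if seen[m[1]] == nil {
					seen[m[1]] = map[string]bool{}
				}
				seen[m[1]][stage] = true
			}
		}
	}
	for vol, s := range seen {
		if !s["setup"] {
			fmt.Printf("no SetUp success seen for %s\n", vol) // worth chasing in a fuller window
		}
	}
}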
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.874086 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-nxz46"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.875506 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-nxz46"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.883420 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.883679 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lmmt4"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.883955 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.884115 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.886418 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-nxz46"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.905950 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76a8-account-create-update-bmt2m"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.909445 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76a8-account-create-update-bmt2m"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.912895 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76a8-account-create-update-bmt2m"]
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.913155 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.931157 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjcs8\" (UniqueName: \"kubernetes.io/projected/7425c462-167a-4f21-9eee-afb9b7b1767e-kube-api-access-pjcs8\") pod \"cinder-b89c-account-create-update-5qdcl\" (UID: \"7425c462-167a-4f21-9eee-afb9b7b1767e\") " pod="openstack/cinder-b89c-account-create-update-5qdcl"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.931207 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8994b-8c3a-4086-babd-3e79a67aae9b-combined-ca-bundle\") pod \"keystone-db-sync-nxz46\" (UID: \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\") " pod="openstack/keystone-db-sync-nxz46"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.931238 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7425c462-167a-4f21-9eee-afb9b7b1767e-operator-scripts\") pod \"cinder-b89c-account-create-update-5qdcl\" (UID: \"7425c462-167a-4f21-9eee-afb9b7b1767e\") " pod="openstack/cinder-b89c-account-create-update-5qdcl"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.931281 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrdgx\" (UniqueName: \"kubernetes.io/projected/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d-kube-api-access-xrdgx\") pod \"neutron-db-create-c46db\" (UID: \"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d\") " pod="openstack/neutron-db-create-c46db"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.931308 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d-operator-scripts\") pod \"neutron-db-create-c46db\" (UID: \"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d\") " pod="openstack/neutron-db-create-c46db"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.931351 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8994b-8c3a-4086-babd-3e79a67aae9b-config-data\") pod \"keystone-db-sync-nxz46\" (UID: \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\") " pod="openstack/keystone-db-sync-nxz46"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.931373 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjbd4\" (UniqueName: \"kubernetes.io/projected/c9e8994b-8c3a-4086-babd-3e79a67aae9b-kube-api-access-xjbd4\") pod \"keystone-db-sync-nxz46\" (UID: \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\") " pod="openstack/keystone-db-sync-nxz46"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.933615 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7425c462-167a-4f21-9eee-afb9b7b1767e-operator-scripts\") pod \"cinder-b89c-account-create-update-5qdcl\" (UID: \"7425c462-167a-4f21-9eee-afb9b7b1767e\") " pod="openstack/cinder-b89c-account-create-update-5qdcl"
Dec 02 08:04:04 crc kubenswrapper[4691]: I1202 08:04:04.936109 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d-operator-scripts\") pod \"neutron-db-create-c46db\" (UID: \"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d\") " pod="openstack/neutron-db-create-c46db"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.002900 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjcs8\" (UniqueName: \"kubernetes.io/projected/7425c462-167a-4f21-9eee-afb9b7b1767e-kube-api-access-pjcs8\") pod \"cinder-b89c-account-create-update-5qdcl\" (UID: \"7425c462-167a-4f21-9eee-afb9b7b1767e\") " pod="openstack/cinder-b89c-account-create-update-5qdcl"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.008265 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrdgx\" (UniqueName: \"kubernetes.io/projected/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d-kube-api-access-xrdgx\") pod \"neutron-db-create-c46db\" (UID: \"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d\") " pod="openstack/neutron-db-create-c46db"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.032641 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8994b-8c3a-4086-babd-3e79a67aae9b-config-data\") pod \"keystone-db-sync-nxz46\" (UID: \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\") " pod="openstack/keystone-db-sync-nxz46"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.032716 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjbd4\" (UniqueName: \"kubernetes.io/projected/c9e8994b-8c3a-4086-babd-3e79a67aae9b-kube-api-access-xjbd4\") pod \"keystone-db-sync-nxz46\" (UID: \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\") " pod="openstack/keystone-db-sync-nxz46"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.033036 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8994b-8c3a-4086-babd-3e79a67aae9b-combined-ca-bundle\") pod \"keystone-db-sync-nxz46\" (UID: \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\") " pod="openstack/keystone-db-sync-nxz46"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.038522 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8994b-8c3a-4086-babd-3e79a67aae9b-combined-ca-bundle\") pod \"keystone-db-sync-nxz46\" (UID: \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\") " pod="openstack/keystone-db-sync-nxz46"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.048187 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8994b-8c3a-4086-babd-3e79a67aae9b-config-data\") pod \"keystone-db-sync-nxz46\" (UID: \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\") " pod="openstack/keystone-db-sync-nxz46"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.068062 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-c46db"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.076348 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjbd4\" (UniqueName: \"kubernetes.io/projected/c9e8994b-8c3a-4086-babd-3e79a67aae9b-kube-api-access-xjbd4\") pod \"keystone-db-sync-nxz46\" (UID: \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\") " pod="openstack/keystone-db-sync-nxz46"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.079226 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lwfnn"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.088420 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b89c-account-create-update-5qdcl"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.134797 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csm4s\" (UniqueName: \"kubernetes.io/projected/b465b231-332d-4086-a3eb-dcf8f02278b8-kube-api-access-csm4s\") pod \"neutron-76a8-account-create-update-bmt2m\" (UID: \"b465b231-332d-4086-a3eb-dcf8f02278b8\") " pod="openstack/neutron-76a8-account-create-update-bmt2m"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.135216 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b465b231-332d-4086-a3eb-dcf8f02278b8-operator-scripts\") pod \"neutron-76a8-account-create-update-bmt2m\" (UID: \"b465b231-332d-4086-a3eb-dcf8f02278b8\") " pod="openstack/neutron-76a8-account-create-update-bmt2m"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.347717 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csm4s\" (UniqueName: \"kubernetes.io/projected/b465b231-332d-4086-a3eb-dcf8f02278b8-kube-api-access-csm4s\") pod \"neutron-76a8-account-create-update-bmt2m\" (UID: \"b465b231-332d-4086-a3eb-dcf8f02278b8\") " pod="openstack/neutron-76a8-account-create-update-bmt2m"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.347812 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b465b231-332d-4086-a3eb-dcf8f02278b8-operator-scripts\") pod \"neutron-76a8-account-create-update-bmt2m\" (UID: \"b465b231-332d-4086-a3eb-dcf8f02278b8\") " pod="openstack/neutron-76a8-account-create-update-bmt2m"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.348834 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b465b231-332d-4086-a3eb-dcf8f02278b8-operator-scripts\") pod \"neutron-76a8-account-create-update-bmt2m\" (UID: \"b465b231-332d-4086-a3eb-dcf8f02278b8\") " pod="openstack/neutron-76a8-account-create-update-bmt2m"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.349267 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-nxz46"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.404441 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csm4s\" (UniqueName: \"kubernetes.io/projected/b465b231-332d-4086-a3eb-dcf8f02278b8-kube-api-access-csm4s\") pod \"neutron-76a8-account-create-update-bmt2m\" (UID: \"b465b231-332d-4086-a3eb-dcf8f02278b8\") " pod="openstack/neutron-76a8-account-create-update-bmt2m"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.567122 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76a8-account-create-update-bmt2m"
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.638067 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-hqmqz"]
Dec 02 08:04:05 crc kubenswrapper[4691]: I1202 08:04:05.750468 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-f40f-account-create-update-9lqsp"]
Dec 02 08:04:05 crc kubenswrapper[4691]: W1202 08:04:05.769022 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62ec2865_a83a_43c0_a786_e5a22b7a7008.slice/crio-007359f14d17d9972584e72970a3a5cef2bea77f9fa90bd3cbacd43bf6e77493 WatchSource:0}: Error finding container 007359f14d17d9972584e72970a3a5cef2bea77f9fa90bd3cbacd43bf6e77493: Status 404 returned error can't find the container with id 007359f14d17d9972584e72970a3a5cef2bea77f9fa90bd3cbacd43bf6e77493
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.027660 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b89c-account-create-update-5qdcl"]
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.053362 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-c46db"]
Dec 02 08:04:06 crc kubenswrapper[4691]: W1202 08:04:06.067992 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c376c7e_d8b9_4f7d_943b_ed3300a60c2d.slice/crio-e798d2a382877b91816de6e6f135c81a9d6758ac82cd513672958c3b2213b478 WatchSource:0}: Error finding container e798d2a382877b91816de6e6f135c81a9d6758ac82cd513672958c3b2213b478: Status 404 returned error can't find the container with id e798d2a382877b91816de6e6f135c81a9d6758ac82cd513672958c3b2213b478
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.219042 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-nxz46"]
Dec 02 08:04:06 crc kubenswrapper[4691]: W1202 08:04:06.239530 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9e8994b_8c3a_4086_babd_3e79a67aae9b.slice/crio-b6c86fbedbcf5e94ba657734a21476abe97e3caf9c81f3f96065ed1049f527f4 WatchSource:0}: Error finding container b6c86fbedbcf5e94ba657734a21476abe97e3caf9c81f3f96065ed1049f527f4: Status 404 returned error can't find the container with id b6c86fbedbcf5e94ba657734a21476abe97e3caf9c81f3f96065ed1049f527f4
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.320456 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-lwfnn"]
Dec 02 08:04:06 crc kubenswrapper[4691]: W1202 08:04:06.330170 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92913a64_59ad_47ca_ad56_a2ad01fbc281.slice/crio-c3adb38fb2e67b08cb32f09bb946bcbd3a47a9e8287ccd9e5401861f183616c1 WatchSource:0}: Error finding container c3adb38fb2e67b08cb32f09bb946bcbd3a47a9e8287ccd9e5401861f183616c1: Status 404 returned error can't find the container with id c3adb38fb2e67b08cb32f09bb946bcbd3a47a9e8287ccd9e5401861f183616c1
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.408980 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76a8-account-create-update-bmt2m"]
Dec 02 08:04:06 crc kubenswrapper[4691]: W1202 08:04:06.458016 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb465b231_332d_4086_a3eb_dcf8f02278b8.slice/crio-676067f720dbf1bf659a7bd450f7c285f76fa4c6dfb7e494747c06bcfdf7a241 WatchSource:0}: Error finding container 676067f720dbf1bf659a7bd450f7c285f76fa4c6dfb7e494747c06bcfdf7a241: Status 404 returned error can't find the container with id 676067f720dbf1bf659a7bd450f7c285f76fa4c6dfb7e494747c06bcfdf7a241
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.645426 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lwfnn" event={"ID":"92913a64-59ad-47ca-ad56-a2ad01fbc281","Type":"ContainerStarted","Data":"4727e5fb0790a48545f8ffb77ea8c8ae35747dccd94f7c8422e2c2d82ab566b8"}
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.645565 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lwfnn" event={"ID":"92913a64-59ad-47ca-ad56-a2ad01fbc281","Type":"ContainerStarted","Data":"c3adb38fb2e67b08cb32f09bb946bcbd3a47a9e8287ccd9e5401861f183616c1"}
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.650295 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nxz46" event={"ID":"c9e8994b-8c3a-4086-babd-3e79a67aae9b","Type":"ContainerStarted","Data":"b6c86fbedbcf5e94ba657734a21476abe97e3caf9c81f3f96065ed1049f527f4"}
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.662826 4691 generic.go:334] "Generic (PLEG): container finished" podID="62ec2865-a83a-43c0-a786-e5a22b7a7008" containerID="db04fb8cda17884a359e715d4b327344ffa90fbc34a3fa6c8386d8f09bee61ea" exitCode=0
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.663053 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f40f-account-create-update-9lqsp" event={"ID":"62ec2865-a83a-43c0-a786-e5a22b7a7008","Type":"ContainerDied","Data":"db04fb8cda17884a359e715d4b327344ffa90fbc34a3fa6c8386d8f09bee61ea"}
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.663081 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f40f-account-create-update-9lqsp" event={"ID":"62ec2865-a83a-43c0-a786-e5a22b7a7008","Type":"ContainerStarted","Data":"007359f14d17d9972584e72970a3a5cef2bea77f9fa90bd3cbacd43bf6e77493"}
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.666201 4691 generic.go:334] "Generic (PLEG): container finished" podID="ad884439-9cac-4888-902a-81c30359a9b9" containerID="7d27aa6e1a6a0f5ae69df5a48c3d5b0eb21d285919a5bf79167f074d28837bac" exitCode=0
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.666359 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-hqmqz" event={"ID":"ad884439-9cac-4888-902a-81c30359a9b9","Type":"ContainerDied","Data":"7d27aa6e1a6a0f5ae69df5a48c3d5b0eb21d285919a5bf79167f074d28837bac"}
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.666397 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-hqmqz" event={"ID":"ad884439-9cac-4888-902a-81c30359a9b9","Type":"ContainerStarted","Data":"e6e58abe3649d0c50ecefcf8ce4e4a17bf255620e6a61e00eecebcd86e94d1b0"}
Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.673194 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-lwfnn" podStartSLOduration=2.6731750290000003 podStartE2EDuration="2.673175029s" podCreationTimestamp="2025-12-02 08:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:04:06.665059695 +0000 UTC m=+1094.449138547" watchObservedRunningTime="2025-12-02 08:04:06.673175029 +0000 UTC m=+1094.457253891"
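The db-create and account-create pods above are one-shot jobs: a single container runs and generic.go:334 reports "container finished" with exitCode=0 before the ContainerDied event lands. A sketch that surfaces any nonzero exit code from those entries, so a failed schema job stands out from the normal completions:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches the "Generic (PLEG): container finished" phrasing in this journal.
var finRe = regexp.MustCompile(`"Generic \(PLEG\): container finished" podID="([^"]+)" containerID="([0-9a-f]+)" exitCode=(-?\d+)`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024)
	for sc.Scan() {
		if m := finRe.FindStringSubmatch(sc.Text()); m != nil {
			marker := "ok"
			if m[3] != "0" {
				marker = "CHECK"
			}
			fmt.Printf("%-5s podID=%s container=%.12s exit=%s\n", marker, m[1], m[2], m[3])
		}
	}
}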
observedRunningTime="2025-12-02 08:04:06.665059695 +0000 UTC m=+1094.449138547" watchObservedRunningTime="2025-12-02 08:04:06.673175029 +0000 UTC m=+1094.457253891" Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.676340 4691 generic.go:334] "Generic (PLEG): container finished" podID="3c376c7e-d8b9-4f7d-943b-ed3300a60c2d" containerID="f1b59aa075304131bb6bbf8aa041011b103f08d90ce36861d16bb40fd71cb070" exitCode=0 Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.676475 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-c46db" event={"ID":"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d","Type":"ContainerDied","Data":"f1b59aa075304131bb6bbf8aa041011b103f08d90ce36861d16bb40fd71cb070"} Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.676515 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-c46db" event={"ID":"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d","Type":"ContainerStarted","Data":"e798d2a382877b91816de6e6f135c81a9d6758ac82cd513672958c3b2213b478"} Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.692432 4691 generic.go:334] "Generic (PLEG): container finished" podID="7425c462-167a-4f21-9eee-afb9b7b1767e" containerID="8a298c05f7b53b3e8cd6b0bbaafb5d9be090886f3e1a82502b4ab3ae6c9c34bb" exitCode=0 Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.692575 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b89c-account-create-update-5qdcl" event={"ID":"7425c462-167a-4f21-9eee-afb9b7b1767e","Type":"ContainerDied","Data":"8a298c05f7b53b3e8cd6b0bbaafb5d9be090886f3e1a82502b4ab3ae6c9c34bb"} Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.692606 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b89c-account-create-update-5qdcl" event={"ID":"7425c462-167a-4f21-9eee-afb9b7b1767e","Type":"ContainerStarted","Data":"899e01e9806aed181945975240a5d309b35883ef1160924b8def1dbcfa15cc00"} Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.698845 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76a8-account-create-update-bmt2m" event={"ID":"b465b231-332d-4086-a3eb-dcf8f02278b8","Type":"ContainerStarted","Data":"676067f720dbf1bf659a7bd450f7c285f76fa4c6dfb7e494747c06bcfdf7a241"} Dec 02 08:04:06 crc kubenswrapper[4691]: I1202 08:04:06.784247 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76a8-account-create-update-bmt2m" podStartSLOduration=2.784228042 podStartE2EDuration="2.784228042s" podCreationTimestamp="2025-12-02 08:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:04:06.774863067 +0000 UTC m=+1094.558941919" watchObservedRunningTime="2025-12-02 08:04:06.784228042 +0000 UTC m=+1094.568306914" Dec 02 08:04:07 crc kubenswrapper[4691]: I1202 08:04:07.715215 4691 generic.go:334] "Generic (PLEG): container finished" podID="b465b231-332d-4086-a3eb-dcf8f02278b8" containerID="2b217d80dc4558ad11c48b8693c2a8f77a7e90e6b4edc082aaa97eeef516ccd9" exitCode=0 Dec 02 08:04:07 crc kubenswrapper[4691]: I1202 08:04:07.715365 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76a8-account-create-update-bmt2m" event={"ID":"b465b231-332d-4086-a3eb-dcf8f02278b8","Type":"ContainerDied","Data":"2b217d80dc4558ad11c48b8693c2a8f77a7e90e6b4edc082aaa97eeef516ccd9"} Dec 02 08:04:07 crc kubenswrapper[4691]: I1202 08:04:07.722807 4691 generic.go:334] "Generic (PLEG): container finished" 
podID="92913a64-59ad-47ca-ad56-a2ad01fbc281" containerID="4727e5fb0790a48545f8ffb77ea8c8ae35747dccd94f7c8422e2c2d82ab566b8" exitCode=0 Dec 02 08:04:07 crc kubenswrapper[4691]: I1202 08:04:07.723042 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lwfnn" event={"ID":"92913a64-59ad-47ca-ad56-a2ad01fbc281","Type":"ContainerDied","Data":"4727e5fb0790a48545f8ffb77ea8c8ae35747dccd94f7c8422e2c2d82ab566b8"} Dec 02 08:04:07 crc kubenswrapper[4691]: I1202 08:04:07.729247 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nxcgj" event={"ID":"4e433058-a34d-4156-9a25-07a573d1c4d2","Type":"ContainerStarted","Data":"77f128d5bae6769cd4c4a22a70e7c45068adcec9139a6fa01b26443bbacd4909"} Dec 02 08:04:07 crc kubenswrapper[4691]: I1202 08:04:07.752830 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-nxcgj" podStartSLOduration=3.416357597 podStartE2EDuration="40.752808057s" podCreationTimestamp="2025-12-02 08:03:27 +0000 UTC" firstStartedPulling="2025-12-02 08:03:28.767585321 +0000 UTC m=+1056.551664183" lastFinishedPulling="2025-12-02 08:04:06.104035771 +0000 UTC m=+1093.888114643" observedRunningTime="2025-12-02 08:04:07.752336475 +0000 UTC m=+1095.536415347" watchObservedRunningTime="2025-12-02 08:04:07.752808057 +0000 UTC m=+1095.536886939" Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.211528 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f40f-account-create-update-9lqsp" Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.253719 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5ww7\" (UniqueName: \"kubernetes.io/projected/62ec2865-a83a-43c0-a786-e5a22b7a7008-kube-api-access-d5ww7\") pod \"62ec2865-a83a-43c0-a786-e5a22b7a7008\" (UID: \"62ec2865-a83a-43c0-a786-e5a22b7a7008\") " Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.254091 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62ec2865-a83a-43c0-a786-e5a22b7a7008-operator-scripts\") pod \"62ec2865-a83a-43c0-a786-e5a22b7a7008\" (UID: \"62ec2865-a83a-43c0-a786-e5a22b7a7008\") " Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.255491 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62ec2865-a83a-43c0-a786-e5a22b7a7008-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62ec2865-a83a-43c0-a786-e5a22b7a7008" (UID: "62ec2865-a83a-43c0-a786-e5a22b7a7008"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.357468 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62ec2865-a83a-43c0-a786-e5a22b7a7008-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.369975 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62ec2865-a83a-43c0-a786-e5a22b7a7008-kube-api-access-d5ww7" (OuterVolumeSpecName: "kube-api-access-d5ww7") pod "62ec2865-a83a-43c0-a786-e5a22b7a7008" (UID: "62ec2865-a83a-43c0-a786-e5a22b7a7008"). InnerVolumeSpecName "kube-api-access-d5ww7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.450298 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b89c-account-create-update-5qdcl" Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.458286 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7425c462-167a-4f21-9eee-afb9b7b1767e-operator-scripts\") pod \"7425c462-167a-4f21-9eee-afb9b7b1767e\" (UID: \"7425c462-167a-4f21-9eee-afb9b7b1767e\") " Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.458432 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjcs8\" (UniqueName: \"kubernetes.io/projected/7425c462-167a-4f21-9eee-afb9b7b1767e-kube-api-access-pjcs8\") pod \"7425c462-167a-4f21-9eee-afb9b7b1767e\" (UID: \"7425c462-167a-4f21-9eee-afb9b7b1767e\") " Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.458701 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5ww7\" (UniqueName: \"kubernetes.io/projected/62ec2865-a83a-43c0-a786-e5a22b7a7008-kube-api-access-d5ww7\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.459610 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7425c462-167a-4f21-9eee-afb9b7b1767e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7425c462-167a-4f21-9eee-afb9b7b1767e" (UID: "7425c462-167a-4f21-9eee-afb9b7b1767e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.460332 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hqmqz" Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.463672 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7425c462-167a-4f21-9eee-afb9b7b1767e-kube-api-access-pjcs8" (OuterVolumeSpecName: "kube-api-access-pjcs8") pod "7425c462-167a-4f21-9eee-afb9b7b1767e" (UID: "7425c462-167a-4f21-9eee-afb9b7b1767e"). InnerVolumeSpecName "kube-api-access-pjcs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.473004 4691 util.go:48] "No ready sandbox for pod can be found. 
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.559041 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrdgx\" (UniqueName: \"kubernetes.io/projected/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d-kube-api-access-xrdgx\") pod \"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d\" (UID: \"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d\") "
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.559419 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad884439-9cac-4888-902a-81c30359a9b9-operator-scripts\") pod \"ad884439-9cac-4888-902a-81c30359a9b9\" (UID: \"ad884439-9cac-4888-902a-81c30359a9b9\") "
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.559575 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d-operator-scripts\") pod \"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d\" (UID: \"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d\") "
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.559684 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wq2z4\" (UniqueName: \"kubernetes.io/projected/ad884439-9cac-4888-902a-81c30359a9b9-kube-api-access-wq2z4\") pod \"ad884439-9cac-4888-902a-81c30359a9b9\" (UID: \"ad884439-9cac-4888-902a-81c30359a9b9\") "
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.560091 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjcs8\" (UniqueName: \"kubernetes.io/projected/7425c462-167a-4f21-9eee-afb9b7b1767e-kube-api-access-pjcs8\") on node \"crc\" DevicePath \"\""
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.560179 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7425c462-167a-4f21-9eee-afb9b7b1767e-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.562420 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad884439-9cac-4888-902a-81c30359a9b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ad884439-9cac-4888-902a-81c30359a9b9" (UID: "ad884439-9cac-4888-902a-81c30359a9b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.564378 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c376c7e-d8b9-4f7d-943b-ed3300a60c2d" (UID: "3c376c7e-d8b9-4f7d-943b-ed3300a60c2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.565096 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad884439-9cac-4888-902a-81c30359a9b9-kube-api-access-wq2z4" (OuterVolumeSpecName: "kube-api-access-wq2z4") pod "ad884439-9cac-4888-902a-81c30359a9b9" (UID: "ad884439-9cac-4888-902a-81c30359a9b9"). InnerVolumeSpecName "kube-api-access-wq2z4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.569021 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d-kube-api-access-xrdgx" (OuterVolumeSpecName: "kube-api-access-xrdgx") pod "3c376c7e-d8b9-4f7d-943b-ed3300a60c2d" (UID: "3c376c7e-d8b9-4f7d-943b-ed3300a60c2d"). InnerVolumeSpecName "kube-api-access-xrdgx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:04:08 crc kubenswrapper[4691]: E1202 08:04:08.643859 4691 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podad884439_9cac_4888_902a_81c30359a9b9.slice/crio-e6e58abe3649d0c50ecefcf8ce4e4a17bf255620e6a61e00eecebcd86e94d1b0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7425c462_167a_4f21_9eee_afb9b7b1767e.slice/crio-899e01e9806aed181945975240a5d309b35883ef1160924b8def1dbcfa15cc00\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62ec2865_a83a_43c0_a786_e5a22b7a7008.slice\": RecentStats: unable to find data in memory cache]"
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.662408 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.662438 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq2z4\" (UniqueName: \"kubernetes.io/projected/ad884439-9cac-4888-902a-81c30359a9b9-kube-api-access-wq2z4\") on node \"crc\" DevicePath \"\""
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.662450 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrdgx\" (UniqueName: \"kubernetes.io/projected/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d-kube-api-access-xrdgx\") on node \"crc\" DevicePath \"\""
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.662459 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ad884439-9cac-4888-902a-81c30359a9b9-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.739137 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-f40f-account-create-update-9lqsp" event={"ID":"62ec2865-a83a-43c0-a786-e5a22b7a7008","Type":"ContainerDied","Data":"007359f14d17d9972584e72970a3a5cef2bea77f9fa90bd3cbacd43bf6e77493"}
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.740057 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="007359f14d17d9972584e72970a3a5cef2bea77f9fa90bd3cbacd43bf6e77493"
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.740161 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-f40f-account-create-update-9lqsp"
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.742832 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-hqmqz" event={"ID":"ad884439-9cac-4888-902a-81c30359a9b9","Type":"ContainerDied","Data":"e6e58abe3649d0c50ecefcf8ce4e4a17bf255620e6a61e00eecebcd86e94d1b0"}
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.742919 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6e58abe3649d0c50ecefcf8ce4e4a17bf255620e6a61e00eecebcd86e94d1b0"
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.743021 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-hqmqz"
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.750555 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-c46db" event={"ID":"3c376c7e-d8b9-4f7d-943b-ed3300a60c2d","Type":"ContainerDied","Data":"e798d2a382877b91816de6e6f135c81a9d6758ac82cd513672958c3b2213b478"}
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.750577 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e798d2a382877b91816de6e6f135c81a9d6758ac82cd513672958c3b2213b478"
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.750611 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-c46db"
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.753468 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b89c-account-create-update-5qdcl"
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.753947 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b89c-account-create-update-5qdcl" event={"ID":"7425c462-167a-4f21-9eee-afb9b7b1767e","Type":"ContainerDied","Data":"899e01e9806aed181945975240a5d309b35883ef1160924b8def1dbcfa15cc00"}
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.754036 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="899e01e9806aed181945975240a5d309b35883ef1160924b8def1dbcfa15cc00"
Dec 02 08:04:08 crc kubenswrapper[4691]: I1202 08:04:08.975735 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-lwfnn"
Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.155673 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76a8-account-create-update-bmt2m"
Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.168614 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97cw9\" (UniqueName: \"kubernetes.io/projected/92913a64-59ad-47ca-ad56-a2ad01fbc281-kube-api-access-97cw9\") pod \"92913a64-59ad-47ca-ad56-a2ad01fbc281\" (UID: \"92913a64-59ad-47ca-ad56-a2ad01fbc281\") "
Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.168898 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92913a64-59ad-47ca-ad56-a2ad01fbc281-operator-scripts\") pod \"92913a64-59ad-47ca-ad56-a2ad01fbc281\" (UID: \"92913a64-59ad-47ca-ad56-a2ad01fbc281\") "
Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.169812 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/92913a64-59ad-47ca-ad56-a2ad01fbc281-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "92913a64-59ad-47ca-ad56-a2ad01fbc281" (UID: "92913a64-59ad-47ca-ad56-a2ad01fbc281"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.173024 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92913a64-59ad-47ca-ad56-a2ad01fbc281-kube-api-access-97cw9" (OuterVolumeSpecName: "kube-api-access-97cw9") pod "92913a64-59ad-47ca-ad56-a2ad01fbc281" (UID: "92913a64-59ad-47ca-ad56-a2ad01fbc281"). InnerVolumeSpecName "kube-api-access-97cw9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.270096 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csm4s\" (UniqueName: \"kubernetes.io/projected/b465b231-332d-4086-a3eb-dcf8f02278b8-kube-api-access-csm4s\") pod \"b465b231-332d-4086-a3eb-dcf8f02278b8\" (UID: \"b465b231-332d-4086-a3eb-dcf8f02278b8\") "
Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.270335 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b465b231-332d-4086-a3eb-dcf8f02278b8-operator-scripts\") pod \"b465b231-332d-4086-a3eb-dcf8f02278b8\" (UID: \"b465b231-332d-4086-a3eb-dcf8f02278b8\") "
Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.271071 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97cw9\" (UniqueName: \"kubernetes.io/projected/92913a64-59ad-47ca-ad56-a2ad01fbc281-kube-api-access-97cw9\") on node \"crc\" DevicePath \"\""
Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.271094 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/92913a64-59ad-47ca-ad56-a2ad01fbc281-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.271239 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b465b231-332d-4086-a3eb-dcf8f02278b8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b465b231-332d-4086-a3eb-dcf8f02278b8" (UID: "b465b231-332d-4086-a3eb-dcf8f02278b8"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.274846 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b465b231-332d-4086-a3eb-dcf8f02278b8-kube-api-access-csm4s" (OuterVolumeSpecName: "kube-api-access-csm4s") pod "b465b231-332d-4086-a3eb-dcf8f02278b8" (UID: "b465b231-332d-4086-a3eb-dcf8f02278b8"). InnerVolumeSpecName "kube-api-access-csm4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.373345 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csm4s\" (UniqueName: \"kubernetes.io/projected/b465b231-332d-4086-a3eb-dcf8f02278b8-kube-api-access-csm4s\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.373409 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b465b231-332d-4086-a3eb-dcf8f02278b8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.779667 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76a8-account-create-update-bmt2m" Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.779701 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76a8-account-create-update-bmt2m" event={"ID":"b465b231-332d-4086-a3eb-dcf8f02278b8","Type":"ContainerDied","Data":"676067f720dbf1bf659a7bd450f7c285f76fa4c6dfb7e494747c06bcfdf7a241"} Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.781000 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="676067f720dbf1bf659a7bd450f7c285f76fa4c6dfb7e494747c06bcfdf7a241" Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.791309 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-lwfnn" event={"ID":"92913a64-59ad-47ca-ad56-a2ad01fbc281","Type":"ContainerDied","Data":"c3adb38fb2e67b08cb32f09bb946bcbd3a47a9e8287ccd9e5401861f183616c1"} Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.791384 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3adb38fb2e67b08cb32f09bb946bcbd3a47a9e8287ccd9e5401861f183616c1" Dec 02 08:04:09 crc kubenswrapper[4691]: I1202 08:04:09.791451 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-lwfnn" Dec 02 08:04:12 crc kubenswrapper[4691]: I1202 08:04:12.176147 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" Dec 02 08:04:12 crc kubenswrapper[4691]: I1202 08:04:12.408230 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mvp4b"] Dec 02 08:04:12 crc kubenswrapper[4691]: I1202 08:04:12.408940 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-mvp4b" podUID="d5f602b9-3363-49e4-b0c0-b0c520b5feaf" containerName="dnsmasq-dns" containerID="cri-o://bd76f007863257f002eb243891e20aa281636a5729ebd092c94791d8128966b4" gracePeriod=10 Dec 02 08:04:12 crc kubenswrapper[4691]: I1202 08:04:12.824713 4691 generic.go:334] "Generic (PLEG): container finished" podID="d5f602b9-3363-49e4-b0c0-b0c520b5feaf" containerID="bd76f007863257f002eb243891e20aa281636a5729ebd092c94791d8128966b4" exitCode=0 Dec 02 08:04:12 crc kubenswrapper[4691]: I1202 08:04:12.824816 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mvp4b" event={"ID":"d5f602b9-3363-49e4-b0c0-b0c520b5feaf","Type":"ContainerDied","Data":"bd76f007863257f002eb243891e20aa281636a5729ebd092c94791d8128966b4"} Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.426051 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.617491 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-config\") pod \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.617537 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-dns-svc\") pod \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.617614 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nttlf\" (UniqueName: \"kubernetes.io/projected/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-kube-api-access-nttlf\") pod \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.617660 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-ovsdbserver-nb\") pod \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.619470 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-ovsdbserver-sb\") pod \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\" (UID: \"d5f602b9-3363-49e4-b0c0-b0c520b5feaf\") " Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.623045 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-kube-api-access-nttlf" (OuterVolumeSpecName: "kube-api-access-nttlf") 
pod "d5f602b9-3363-49e4-b0c0-b0c520b5feaf" (UID: "d5f602b9-3363-49e4-b0c0-b0c520b5feaf"). InnerVolumeSpecName "kube-api-access-nttlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.662935 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-config" (OuterVolumeSpecName: "config") pod "d5f602b9-3363-49e4-b0c0-b0c520b5feaf" (UID: "d5f602b9-3363-49e4-b0c0-b0c520b5feaf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.663620 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d5f602b9-3363-49e4-b0c0-b0c520b5feaf" (UID: "d5f602b9-3363-49e4-b0c0-b0c520b5feaf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.664174 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d5f602b9-3363-49e4-b0c0-b0c520b5feaf" (UID: "d5f602b9-3363-49e4-b0c0-b0c520b5feaf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.679097 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d5f602b9-3363-49e4-b0c0-b0c520b5feaf" (UID: "d5f602b9-3363-49e4-b0c0-b0c520b5feaf"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.724831 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.724863 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.724958 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.725314 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nttlf\" (UniqueName: \"kubernetes.io/projected/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-kube-api-access-nttlf\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.725325 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d5f602b9-3363-49e4-b0c0-b0c520b5feaf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.848077 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-mvp4b" event={"ID":"d5f602b9-3363-49e4-b0c0-b0c520b5feaf","Type":"ContainerDied","Data":"316af0189478220fc980d86c0262bf1ae73630f9fc3a9284b680f3780ef0be02"} Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.848144 4691 scope.go:117] "RemoveContainer" containerID="bd76f007863257f002eb243891e20aa281636a5729ebd092c94791d8128966b4" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.848314 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-mvp4b" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.859037 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nxz46" event={"ID":"c9e8994b-8c3a-4086-babd-3e79a67aae9b","Type":"ContainerStarted","Data":"34a11708d751994a7640e1c70737b8672f0866a7d5df6ef924491cc82aa9d617"} Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.882775 4691 scope.go:117] "RemoveContainer" containerID="8e9f05105658aca9613058a5fe7c5d10f9a7f47e168d586e156636a40650c54d" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.884800 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-nxz46" podStartSLOduration=3.139211433 podStartE2EDuration="9.884788923s" podCreationTimestamp="2025-12-02 08:04:04 +0000 UTC" firstStartedPulling="2025-12-02 08:04:06.242899115 +0000 UTC m=+1094.026977977" lastFinishedPulling="2025-12-02 08:04:12.988476605 +0000 UTC m=+1100.772555467" observedRunningTime="2025-12-02 08:04:13.880785822 +0000 UTC m=+1101.664864694" watchObservedRunningTime="2025-12-02 08:04:13.884788923 +0000 UTC m=+1101.668867775" Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.918646 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mvp4b"] Dec 02 08:04:13 crc kubenswrapper[4691]: I1202 08:04:13.926551 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-mvp4b"] Dec 02 08:04:14 crc kubenswrapper[4691]: I1202 08:04:14.881626 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5f602b9-3363-49e4-b0c0-b0c520b5feaf" path="/var/lib/kubelet/pods/d5f602b9-3363-49e4-b0c0-b0c520b5feaf/volumes" Dec 02 08:04:17 crc kubenswrapper[4691]: I1202 08:04:17.907698 4691 generic.go:334] "Generic (PLEG): container finished" podID="c9e8994b-8c3a-4086-babd-3e79a67aae9b" containerID="34a11708d751994a7640e1c70737b8672f0866a7d5df6ef924491cc82aa9d617" exitCode=0 Dec 02 08:04:17 crc kubenswrapper[4691]: I1202 08:04:17.907805 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nxz46" event={"ID":"c9e8994b-8c3a-4086-babd-3e79a67aae9b","Type":"ContainerDied","Data":"34a11708d751994a7640e1c70737b8672f0866a7d5df6ef924491cc82aa9d617"} Dec 02 08:04:18 crc kubenswrapper[4691]: I1202 08:04:18.927157 4691 generic.go:334] "Generic (PLEG): container finished" podID="4e433058-a34d-4156-9a25-07a573d1c4d2" containerID="77f128d5bae6769cd4c4a22a70e7c45068adcec9139a6fa01b26443bbacd4909" exitCode=0 Dec 02 08:04:18 crc kubenswrapper[4691]: I1202 08:04:18.927282 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nxcgj" event={"ID":"4e433058-a34d-4156-9a25-07a573d1c4d2","Type":"ContainerDied","Data":"77f128d5bae6769cd4c4a22a70e7c45068adcec9139a6fa01b26443bbacd4909"} Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.229235 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-nxz46" Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.366312 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8994b-8c3a-4086-babd-3e79a67aae9b-combined-ca-bundle\") pod \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\" (UID: \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\") " Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.366528 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjbd4\" (UniqueName: \"kubernetes.io/projected/c9e8994b-8c3a-4086-babd-3e79a67aae9b-kube-api-access-xjbd4\") pod \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\" (UID: \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\") " Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.366621 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8994b-8c3a-4086-babd-3e79a67aae9b-config-data\") pod \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\" (UID: \"c9e8994b-8c3a-4086-babd-3e79a67aae9b\") " Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.375555 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e8994b-8c3a-4086-babd-3e79a67aae9b-kube-api-access-xjbd4" (OuterVolumeSpecName: "kube-api-access-xjbd4") pod "c9e8994b-8c3a-4086-babd-3e79a67aae9b" (UID: "c9e8994b-8c3a-4086-babd-3e79a67aae9b"). InnerVolumeSpecName "kube-api-access-xjbd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.397305 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e8994b-8c3a-4086-babd-3e79a67aae9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9e8994b-8c3a-4086-babd-3e79a67aae9b" (UID: "c9e8994b-8c3a-4086-babd-3e79a67aae9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.422042 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9e8994b-8c3a-4086-babd-3e79a67aae9b-config-data" (OuterVolumeSpecName: "config-data") pod "c9e8994b-8c3a-4086-babd-3e79a67aae9b" (UID: "c9e8994b-8c3a-4086-babd-3e79a67aae9b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.469790 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjbd4\" (UniqueName: \"kubernetes.io/projected/c9e8994b-8c3a-4086-babd-3e79a67aae9b-kube-api-access-xjbd4\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.469866 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9e8994b-8c3a-4086-babd-3e79a67aae9b-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.469882 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9e8994b-8c3a-4086-babd-3e79a67aae9b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.939163 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-nxz46" Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.939403 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-nxz46" event={"ID":"c9e8994b-8c3a-4086-babd-3e79a67aae9b","Type":"ContainerDied","Data":"b6c86fbedbcf5e94ba657734a21476abe97e3caf9c81f3f96065ed1049f527f4"} Dec 02 08:04:19 crc kubenswrapper[4691]: I1202 08:04:19.939559 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6c86fbedbcf5e94ba657734a21476abe97e3caf9c81f3f96065ed1049f527f4" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.192728 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-qlrg7"] Dec 02 08:04:20 crc kubenswrapper[4691]: E1202 08:04:20.193119 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c376c7e-d8b9-4f7d-943b-ed3300a60c2d" containerName="mariadb-database-create" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.193134 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c376c7e-d8b9-4f7d-943b-ed3300a60c2d" containerName="mariadb-database-create" Dec 02 08:04:20 crc kubenswrapper[4691]: E1202 08:04:20.193156 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e8994b-8c3a-4086-babd-3e79a67aae9b" containerName="keystone-db-sync" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.193162 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e8994b-8c3a-4086-babd-3e79a67aae9b" containerName="keystone-db-sync" Dec 02 08:04:20 crc kubenswrapper[4691]: E1202 08:04:20.193175 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7425c462-167a-4f21-9eee-afb9b7b1767e" containerName="mariadb-account-create-update" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.193187 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7425c462-167a-4f21-9eee-afb9b7b1767e" containerName="mariadb-account-create-update" Dec 02 08:04:20 crc kubenswrapper[4691]: E1202 08:04:20.193198 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f602b9-3363-49e4-b0c0-b0c520b5feaf" containerName="dnsmasq-dns" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.193205 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f602b9-3363-49e4-b0c0-b0c520b5feaf" containerName="dnsmasq-dns" Dec 02 08:04:20 crc kubenswrapper[4691]: E1202 08:04:20.198232 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b465b231-332d-4086-a3eb-dcf8f02278b8" containerName="mariadb-account-create-update" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198280 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b465b231-332d-4086-a3eb-dcf8f02278b8" containerName="mariadb-account-create-update" Dec 02 08:04:20 crc kubenswrapper[4691]: E1202 08:04:20.198335 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92913a64-59ad-47ca-ad56-a2ad01fbc281" containerName="mariadb-database-create" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198344 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="92913a64-59ad-47ca-ad56-a2ad01fbc281" containerName="mariadb-database-create" Dec 02 08:04:20 crc kubenswrapper[4691]: E1202 08:04:20.198357 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62ec2865-a83a-43c0-a786-e5a22b7a7008" containerName="mariadb-account-create-update" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198363 4691 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="62ec2865-a83a-43c0-a786-e5a22b7a7008" containerName="mariadb-account-create-update" Dec 02 08:04:20 crc kubenswrapper[4691]: E1202 08:04:20.198374 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad884439-9cac-4888-902a-81c30359a9b9" containerName="mariadb-database-create" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198380 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad884439-9cac-4888-902a-81c30359a9b9" containerName="mariadb-database-create" Dec 02 08:04:20 crc kubenswrapper[4691]: E1202 08:04:20.198392 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5f602b9-3363-49e4-b0c0-b0c520b5feaf" containerName="init" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198398 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5f602b9-3363-49e4-b0c0-b0c520b5feaf" containerName="init" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198655 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5f602b9-3363-49e4-b0c0-b0c520b5feaf" containerName="dnsmasq-dns" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198673 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e8994b-8c3a-4086-babd-3e79a67aae9b" containerName="keystone-db-sync" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198683 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad884439-9cac-4888-902a-81c30359a9b9" containerName="mariadb-database-create" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198695 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7425c462-167a-4f21-9eee-afb9b7b1767e" containerName="mariadb-account-create-update" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198707 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="92913a64-59ad-47ca-ad56-a2ad01fbc281" containerName="mariadb-database-create" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198714 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="62ec2865-a83a-43c0-a786-e5a22b7a7008" containerName="mariadb-account-create-update" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198723 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="b465b231-332d-4086-a3eb-dcf8f02278b8" containerName="mariadb-account-create-update" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.198735 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c376c7e-d8b9-4f7d-943b-ed3300a60c2d" containerName="mariadb-database-create" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.206293 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.217581 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-qlrg7"] Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.280841 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-jr4fw"] Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.282099 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.293107 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.293301 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.293411 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lmmt4" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.293527 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.294607 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jr4fw"] Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.297896 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.392875 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-config\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.392931 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-dns-svc\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.392957 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-combined-ca-bundle\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.392991 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-credential-keys\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.393011 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-scripts\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.393034 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-config-data\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.393050 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.393081 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.393125 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xm24\" (UniqueName: \"kubernetes.io/projected/2be077f9-2259-48e5-818d-63804f9951ac-kube-api-access-9xm24\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.393214 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-fernet-keys\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.393237 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.393261 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hhhc\" (UniqueName: \"kubernetes.io/projected/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-kube-api-access-4hhhc\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.491132 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-nxcgj" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.498673 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.498731 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hhhc\" (UniqueName: \"kubernetes.io/projected/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-kube-api-access-4hhhc\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.498772 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-config\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.498805 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-dns-svc\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.498821 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-combined-ca-bundle\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.498851 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-credential-keys\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.498866 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-scripts\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.498891 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-config-data\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.498907 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 
08:04:20.498930 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.498966 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xm24\" (UniqueName: \"kubernetes.io/projected/2be077f9-2259-48e5-818d-63804f9951ac-kube-api-access-9xm24\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.499023 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-fernet-keys\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.500405 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-dns-swift-storage-0\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.501300 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-dns-svc\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.502349 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-config\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.507721 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-ovsdbserver-sb\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.508471 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-ovsdbserver-nb\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.509596 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-config-data\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.515517 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-credential-keys\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.521175 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-scripts\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.544396 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-fernet-keys\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.547139 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hhhc\" (UniqueName: \"kubernetes.io/projected/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-kube-api-access-4hhhc\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.548497 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-combined-ca-bundle\") pod \"keystone-bootstrap-jr4fw\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.555834 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-4tvs4"] Dec 02 08:04:20 crc kubenswrapper[4691]: E1202 08:04:20.556337 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e433058-a34d-4156-9a25-07a573d1c4d2" containerName="glance-db-sync" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.556355 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e433058-a34d-4156-9a25-07a573d1c4d2" containerName="glance-db-sync" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.556538 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e433058-a34d-4156-9a25-07a573d1c4d2" containerName="glance-db-sync" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.557187 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-4tvs4" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.558586 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xm24\" (UniqueName: \"kubernetes.io/projected/2be077f9-2259-48e5-818d-63804f9951ac-kube-api-access-9xm24\") pod \"dnsmasq-dns-5959f8865f-qlrg7\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.567472 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nkqg6" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.567825 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.574936 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.602642 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-config-data\") pod \"4e433058-a34d-4156-9a25-07a573d1c4d2\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.602726 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-db-sync-config-data\") pod \"4e433058-a34d-4156-9a25-07a573d1c4d2\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.602790 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pmkk\" (UniqueName: \"kubernetes.io/projected/4e433058-a34d-4156-9a25-07a573d1c4d2-kube-api-access-8pmkk\") pod \"4e433058-a34d-4156-9a25-07a573d1c4d2\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.612590 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-6qz2r"] Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.613627 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-977b585d5-q75dz"] Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.624235 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-combined-ca-bundle\") pod \"4e433058-a34d-4156-9a25-07a573d1c4d2\" (UID: \"4e433058-a34d-4156-9a25-07a573d1c4d2\") " Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.625040 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt7bf\" (UniqueName: \"kubernetes.io/projected/d7c46467-e60b-47fd-be7b-660d674b6504-kube-api-access-wt7bf\") pod \"neutron-db-sync-4tvs4\" (UID: \"d7c46467-e60b-47fd-be7b-660d674b6504\") " pod="openstack/neutron-db-sync-4tvs4" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.625113 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c46467-e60b-47fd-be7b-660d674b6504-combined-ca-bundle\") pod \"neutron-db-sync-4tvs4\" (UID: \"d7c46467-e60b-47fd-be7b-660d674b6504\") " pod="openstack/neutron-db-sync-4tvs4" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 
08:04:20.625186 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7c46467-e60b-47fd-be7b-660d674b6504-config\") pod \"neutron-db-sync-4tvs4\" (UID: \"d7c46467-e60b-47fd-be7b-660d674b6504\") " pod="openstack/neutron-db-sync-4tvs4" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.625664 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6qz2r" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.647203 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.649473 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e433058-a34d-4156-9a25-07a573d1c4d2-kube-api-access-8pmkk" (OuterVolumeSpecName: "kube-api-access-8pmkk") pod "4e433058-a34d-4156-9a25-07a573d1c4d2" (UID: "4e433058-a34d-4156-9a25-07a573d1c4d2"). InnerVolumeSpecName "kube-api-access-8pmkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.653884 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6qz2r"] Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.654340 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.662924 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4e433058-a34d-4156-9a25-07a573d1c4d2" (UID: "4e433058-a34d-4156-9a25-07a573d1c4d2"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.670134 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fxlkj" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.670367 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.670904 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.673383 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.673669 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.674279 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.677795 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4tvs4"] Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.702779 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-mf4bn" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.720617 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-config-data" (OuterVolumeSpecName: "config-data") pod "4e433058-a34d-4156-9a25-07a573d1c4d2" (UID: "4e433058-a34d-4156-9a25-07a573d1c4d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.741170 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt7bf\" (UniqueName: \"kubernetes.io/projected/d7c46467-e60b-47fd-be7b-660d674b6504-kube-api-access-wt7bf\") pod \"neutron-db-sync-4tvs4\" (UID: \"d7c46467-e60b-47fd-be7b-660d674b6504\") " pod="openstack/neutron-db-sync-4tvs4" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.741284 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c46467-e60b-47fd-be7b-660d674b6504-combined-ca-bundle\") pod \"neutron-db-sync-4tvs4\" (UID: \"d7c46467-e60b-47fd-be7b-660d674b6504\") " pod="openstack/neutron-db-sync-4tvs4" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.741495 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7c46467-e60b-47fd-be7b-660d674b6504-config\") pod \"neutron-db-sync-4tvs4\" (UID: \"d7c46467-e60b-47fd-be7b-660d674b6504\") " pod="openstack/neutron-db-sync-4tvs4" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.744555 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.745088 4691 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.745145 4691 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-8pmkk\" (UniqueName: \"kubernetes.io/projected/4e433058-a34d-4156-9a25-07a573d1c4d2-kube-api-access-8pmkk\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.792560 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c46467-e60b-47fd-be7b-660d674b6504-combined-ca-bundle\") pod \"neutron-db-sync-4tvs4\" (UID: \"d7c46467-e60b-47fd-be7b-660d674b6504\") " pod="openstack/neutron-db-sync-4tvs4" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.798425 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt7bf\" (UniqueName: \"kubernetes.io/projected/d7c46467-e60b-47fd-be7b-660d674b6504-kube-api-access-wt7bf\") pod \"neutron-db-sync-4tvs4\" (UID: \"d7c46467-e60b-47fd-be7b-660d674b6504\") " pod="openstack/neutron-db-sync-4tvs4" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.801385 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7c46467-e60b-47fd-be7b-660d674b6504-config\") pod \"neutron-db-sync-4tvs4\" (UID: \"d7c46467-e60b-47fd-be7b-660d674b6504\") " pod="openstack/neutron-db-sync-4tvs4" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.807805 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-977b585d5-q75dz"] Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.814251 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4e433058-a34d-4156-9a25-07a573d1c4d2" (UID: "4e433058-a34d-4156-9a25-07a573d1c4d2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.844282 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.847287 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-config-data\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.847344 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-db-sync-config-data\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.847372 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-scripts\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.847394 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-combined-ca-bundle\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.847434 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c651616e-857f-4aae-a76b-79bf365695a3-config-data\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.847461 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms6tl\" (UniqueName: \"kubernetes.io/projected/c651616e-857f-4aae-a76b-79bf365695a3-kube-api-access-ms6tl\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.847536 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a217e1fe-be30-4247-91f3-020aaa089689-etc-machine-id\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.847584 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c651616e-857f-4aae-a76b-79bf365695a3-logs\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.847609 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c651616e-857f-4aae-a76b-79bf365695a3-horizon-secret-key\") pod \"horizon-977b585d5-q75dz\" (UID: 
\"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.847647 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s5hb\" (UniqueName: \"kubernetes.io/projected/a217e1fe-be30-4247-91f3-020aaa089689-kube-api-access-6s5hb\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.847671 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c651616e-857f-4aae-a76b-79bf365695a3-scripts\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.847727 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e433058-a34d-4156-9a25-07a573d1c4d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.912351 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-7sh8f"] Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.913540 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7sh8f" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.927989 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.942948 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-skzl9" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.943068 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7sh8f"] Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.948816 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c651616e-857f-4aae-a76b-79bf365695a3-config-data\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.948872 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms6tl\" (UniqueName: \"kubernetes.io/projected/c651616e-857f-4aae-a76b-79bf365695a3-kube-api-access-ms6tl\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.948944 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a217e1fe-be30-4247-91f3-020aaa089689-etc-machine-id\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.949017 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c651616e-857f-4aae-a76b-79bf365695a3-logs\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.949051 4691 
Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.949100 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s5hb\" (UniqueName: \"kubernetes.io/projected/a217e1fe-be30-4247-91f3-020aaa089689-kube-api-access-6s5hb\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r"
Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.949130 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c651616e-857f-4aae-a76b-79bf365695a3-scripts\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz"
Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.949155 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-config-data\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r"
Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.949189 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-db-sync-config-data\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r"
Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.949219 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-scripts\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r"
Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.949240 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-combined-ca-bundle\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r"
Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.951152 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a217e1fe-be30-4247-91f3-020aaa089689-etc-machine-id\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r"
Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.952550 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c651616e-857f-4aae-a76b-79bf365695a3-logs\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz"
Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.953548 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c651616e-857f-4aae-a76b-79bf365695a3-config-data\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz"
\"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.954665 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c651616e-857f-4aae-a76b-79bf365695a3-scripts\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.977830 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c651616e-857f-4aae-a76b-79bf365695a3-horizon-secret-key\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.978417 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2zwm7"] Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.979367 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-db-sync-config-data\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.980790 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:20 crc kubenswrapper[4691]: I1202 08:04:20.986795 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-scripts\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.000294 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.000858 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.001059 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9fpvx" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.021496 4691 util.go:30] "No sandbox for pod can be found. 
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.027904 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-config-data\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.034135 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s5hb\" (UniqueName: \"kubernetes.io/projected/a217e1fe-be30-4247-91f3-020aaa089689-kube-api-access-6s5hb\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.050637 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56f22\" (UniqueName: \"kubernetes.io/projected/570d8c0d-670c-4132-85e4-e13633c3bcc2-kube-api-access-56f22\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.050694 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-config-data\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.050747 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f939f3c-07b4-42b8-94d9-3dbd15c03287-combined-ca-bundle\") pod \"barbican-db-sync-7sh8f\" (UID: \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\") " pod="openstack/barbican-db-sync-7sh8f"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.050818 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6xrr\" (UniqueName: \"kubernetes.io/projected/7f939f3c-07b4-42b8-94d9-3dbd15c03287-kube-api-access-b6xrr\") pod \"barbican-db-sync-7sh8f\" (UID: \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\") " pod="openstack/barbican-db-sync-7sh8f"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.050880 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f939f3c-07b4-42b8-94d9-3dbd15c03287-db-sync-config-data\") pod \"barbican-db-sync-7sh8f\" (UID: \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\") " pod="openstack/barbican-db-sync-7sh8f"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.050903 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-scripts\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.050935 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570d8c0d-670c-4132-85e4-e13633c3bcc2-logs\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7"
pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.050954 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-combined-ca-bundle\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.076512 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nxcgj" event={"ID":"4e433058-a34d-4156-9a25-07a573d1c4d2","Type":"ContainerDied","Data":"6e679b51c118e6dee47f03cea7ed159e5dd4044bc688c1c59577e13753a8ede1"} Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.076565 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e679b51c118e6dee47f03cea7ed159e5dd4044bc688c1c59577e13753a8ede1" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.076659 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nxcgj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.083169 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms6tl\" (UniqueName: \"kubernetes.io/projected/c651616e-857f-4aae-a76b-79bf365695a3-kube-api-access-ms6tl\") pod \"horizon-977b585d5-q75dz\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.086441 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-combined-ca-bundle\") pod \"cinder-db-sync-6qz2r\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") " pod="openstack/cinder-db-sync-6qz2r" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.115808 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-qlrg7"] Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.149693 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2zwm7"] Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.158034 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f939f3c-07b4-42b8-94d9-3dbd15c03287-db-sync-config-data\") pod \"barbican-db-sync-7sh8f\" (UID: \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\") " pod="openstack/barbican-db-sync-7sh8f" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.158082 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-scripts\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.158102 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570d8c0d-670c-4132-85e4-e13633c3bcc2-logs\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.158123 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-combined-ca-bundle\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.158194 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56f22\" (UniqueName: \"kubernetes.io/projected/570d8c0d-670c-4132-85e4-e13633c3bcc2-kube-api-access-56f22\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.158214 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-config-data\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.158247 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f939f3c-07b4-42b8-94d9-3dbd15c03287-combined-ca-bundle\") pod \"barbican-db-sync-7sh8f\" (UID: \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\") " pod="openstack/barbican-db-sync-7sh8f" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.158287 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6xrr\" (UniqueName: \"kubernetes.io/projected/7f939f3c-07b4-42b8-94d9-3dbd15c03287-kube-api-access-b6xrr\") pod \"barbican-db-sync-7sh8f\" (UID: \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\") " pod="openstack/barbican-db-sync-7sh8f" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.159276 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570d8c0d-670c-4132-85e4-e13633c3bcc2-logs\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.175653 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f939f3c-07b4-42b8-94d9-3dbd15c03287-combined-ca-bundle\") pod \"barbican-db-sync-7sh8f\" (UID: \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\") " pod="openstack/barbican-db-sync-7sh8f" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.176871 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-combined-ca-bundle\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.179457 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-config-data\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.184203 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-scripts\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " 
pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.191354 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f939f3c-07b4-42b8-94d9-3dbd15c03287-db-sync-config-data\") pod \"barbican-db-sync-7sh8f\" (UID: \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\") " pod="openstack/barbican-db-sync-7sh8f" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.192340 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-584b486865-hjdcx"] Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.201487 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-584b486865-hjdcx" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.199704 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6xrr\" (UniqueName: \"kubernetes.io/projected/7f939f3c-07b4-42b8-94d9-3dbd15c03287-kube-api-access-b6xrr\") pod \"barbican-db-sync-7sh8f\" (UID: \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\") " pod="openstack/barbican-db-sync-7sh8f" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.211546 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56f22\" (UniqueName: \"kubernetes.io/projected/570d8c0d-670c-4132-85e4-e13633c3bcc2-kube-api-access-56f22\") pod \"placement-db-sync-2zwm7\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " pod="openstack/placement-db-sync-2zwm7" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.245832 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"] Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.247622 4691 util.go:30] "No sandbox for pod can be found. 
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.258745 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-584b486865-hjdcx"]
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.269166 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/03a43183-f7c0-456d-8843-d94b1e97e51e-horizon-secret-key\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.269237 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03a43183-f7c0-456d-8843-d94b1e97e51e-logs\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.269276 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03a43183-f7c0-456d-8843-d94b1e97e51e-config-data\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.269296 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wcrm\" (UniqueName: \"kubernetes.io/projected/03a43183-f7c0-456d-8843-d94b1e97e51e-kube-api-access-2wcrm\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.269324 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03a43183-f7c0-456d-8843-d94b1e97e51e-scripts\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.304362 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"]
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.305157 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7sh8f"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.349397 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2zwm7"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.372110 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.372225 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03a43183-f7c0-456d-8843-d94b1e97e51e-config-data\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.372262 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wcrm\" (UniqueName: \"kubernetes.io/projected/03a43183-f7c0-456d-8843-d94b1e97e51e-kube-api-access-2wcrm\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.372322 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03a43183-f7c0-456d-8843-d94b1e97e51e-scripts\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.372429 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-config\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.372459 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.372532 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.372588 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.372705 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/03a43183-f7c0-456d-8843-d94b1e97e51e-horizon-secret-key\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx"
\"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.372742 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlvwm\" (UniqueName: \"kubernetes.io/projected/d177ac4d-6f73-464e-8c82-a4d525328bf0-kube-api-access-dlvwm\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.372805 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03a43183-f7c0-456d-8843-d94b1e97e51e-logs\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.373496 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03a43183-f7c0-456d-8843-d94b1e97e51e-scripts\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.375154 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.375215 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03a43183-f7c0-456d-8843-d94b1e97e51e-config-data\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.375752 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6qz2r" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.376280 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03a43183-f7c0-456d-8843-d94b1e97e51e-logs\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.413478 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/03a43183-f7c0-456d-8843-d94b1e97e51e-horizon-secret-key\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.413544 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.414034 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wcrm\" (UniqueName: \"kubernetes.io/projected/03a43183-f7c0-456d-8843-d94b1e97e51e-kube-api-access-2wcrm\") pod \"horizon-584b486865-hjdcx\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " pod="openstack/horizon-584b486865-hjdcx" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.427429 4691 util.go:30] "No sandbox for pod can be found. 
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.428579 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.431275 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.431465 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.474206 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-config\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.474256 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.474310 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.474344 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.474426 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlvwm\" (UniqueName: \"kubernetes.io/projected/d177ac4d-6f73-464e-8c82-a4d525328bf0-kube-api-access-dlvwm\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.474472 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.475259 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-config\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.479370 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-ovsdbserver-nb\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"
\"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.489342 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-dns-swift-storage-0\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.494648 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-dns-svc\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.499164 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-ovsdbserver-sb\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.532340 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlvwm\" (UniqueName: \"kubernetes.io/projected/d177ac4d-6f73-464e-8c82-a4d525328bf0-kube-api-access-dlvwm\") pod \"dnsmasq-dns-58dd9ff6bc-xw9k9\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.533583 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-584b486865-hjdcx" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.565246 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-jr4fw"] Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.577425 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-scripts\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.577488 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2qd7\" (UniqueName: \"kubernetes.io/projected/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-kube-api-access-x2qd7\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.577589 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-log-httpd\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.577627 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.577647 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-run-httpd\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.577666 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.577685 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-config-data\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.591526 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.622993 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"] Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.691162 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-log-httpd\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.691248 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.691280 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-run-httpd\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.691310 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.691336 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-config-data\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.691402 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-scripts\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.691467 4691 
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.703864 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-log-httpd\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.707612 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-p84rj"]
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.709267 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.713537 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-config-data\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.715968 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-scripts\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.726234 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-p84rj"]
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.728989 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-run-httpd\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.737047 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.738049 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.773275 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2qd7\" (UniqueName: \"kubernetes.io/projected/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-kube-api-access-x2qd7\") pod \"ceilometer-0\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " pod="openstack/ceilometer-0"
Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.792902 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj"
\"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.792960 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-config\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.793086 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.793135 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.793197 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.793222 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl4vk\" (UniqueName: \"kubernetes.io/projected/8956bae5-b6fb-496d-95b1-775e634fb54b-kube-api-access-jl4vk\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.894816 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.894871 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.894917 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.894937 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jl4vk\" (UniqueName: 
\"kubernetes.io/projected/8956bae5-b6fb-496d-95b1-775e634fb54b-kube-api-access-jl4vk\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.894988 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.895013 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-config\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.896149 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-config\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.896873 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.897463 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.898135 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.899048 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.918844 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl4vk\" (UniqueName: \"kubernetes.io/projected/8956bae5-b6fb-496d-95b1-775e634fb54b-kube-api-access-jl4vk\") pod \"dnsmasq-dns-785d8bcb8c-p84rj\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") " pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:21 crc kubenswrapper[4691]: I1202 08:04:21.937935 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-qlrg7"] Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.015488 4691 util.go:30] "No sandbox for pod can be found. 
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.046417 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-4tvs4"]
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.056528 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj"
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.141349 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jr4fw" event={"ID":"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f","Type":"ContainerStarted","Data":"745535875ba7fe9dcd1be2d4642caf28fbb95705efb80966bc17ae37139205e3"}
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.145878 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" event={"ID":"2be077f9-2259-48e5-818d-63804f9951ac","Type":"ContainerStarted","Data":"2865fe2c347967982c16b668032d211b864aa0615162e1dacc3cf6867188051b"}
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.180597 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2zwm7"]
Dec 02 08:04:22 crc kubenswrapper[4691]: W1202 08:04:22.362357 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod570d8c0d_670c_4132_85e4_e13633c3bcc2.slice/crio-93670a343c1d8fc9d43a193c025e3fa34792c131499b814e794c2cea0c425e52 WatchSource:0}: Error finding container 93670a343c1d8fc9d43a193c025e3fa34792c131499b814e794c2cea0c425e52: Status 404 returned error can't find the container with id 93670a343c1d8fc9d43a193c025e3fa34792c131499b814e794c2cea0c425e52
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.437791 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.439893 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.442989 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.443185 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.443310 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5ngw7"
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.456808 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.519156 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0"
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.519211 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0"
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.519272 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-logs\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0"
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.519303 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-scripts\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0"
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.522013 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-config-data\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0"
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.522092 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8cjc\" (UniqueName: \"kubernetes.io/projected/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-kube-api-access-w8cjc\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0"
Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.522116 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0"
pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.646074 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-logs\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.646449 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-scripts\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.646615 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-config-data\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.646659 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8cjc\" (UniqueName: \"kubernetes.io/projected/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-kube-api-access-w8cjc\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.646679 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.646829 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.646896 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.647291 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.655472 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-logs\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.655558 
4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.673342 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.677034 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-scripts\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.679656 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8cjc\" (UniqueName: \"kubernetes.io/projected/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-kube-api-access-w8cjc\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.682734 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.683034 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-config-data\") pod \"glance-default-external-api-0\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.842014 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.955912 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-7sh8f"] Dec 02 08:04:22 crc kubenswrapper[4691]: I1202 08:04:22.974985 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-977b585d5-q75dz"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.052828 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-6qz2r"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.066583 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.068414 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.070606 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.081426 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-584b486865-hjdcx"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.090320 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.138961 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.153956 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.158916 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.158966 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8220218e-209d-4528-a3ba-f8c057eb9740-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.159010 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.159039 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.159088 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6hqg\" (UniqueName: \"kubernetes.io/projected/8220218e-209d-4528-a3ba-f8c057eb9740-kube-api-access-s6hqg\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.159127 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8220218e-209d-4528-a3ba-f8c057eb9740-logs\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.159155 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.177089 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6qz2r" event={"ID":"a217e1fe-be30-4247-91f3-020aaa089689","Type":"ContainerStarted","Data":"9a40aacf676cc0605aa54c512cbfa611437da37fba3978e49712ae02b734e29c"} Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.177951 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2zwm7" event={"ID":"570d8c0d-670c-4132-85e4-e13633c3bcc2","Type":"ContainerStarted","Data":"93670a343c1d8fc9d43a193c025e3fa34792c131499b814e794c2cea0c425e52"} Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.183588 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7sh8f" event={"ID":"7f939f3c-07b4-42b8-94d9-3dbd15c03287","Type":"ContainerStarted","Data":"0fc87a782fb450edea5abb46c422787bbcf66518b0ba5a31e6ac0da512349fb5"} Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.188640 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-584b486865-hjdcx" event={"ID":"03a43183-f7c0-456d-8843-d94b1e97e51e","Type":"ContainerStarted","Data":"a49e8a1d540d83969f4ca6bec849f0957c3b4129ac1250acc0fab1420c24189d"} Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.197094 4691 generic.go:334] "Generic (PLEG): container finished" podID="2be077f9-2259-48e5-818d-63804f9951ac" containerID="a13347e91b2c358f88ddcbe4ed05e22a51218beb5b260889bb39053ee6c001e2" exitCode=0 Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.197147 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" event={"ID":"2be077f9-2259-48e5-818d-63804f9951ac","Type":"ContainerDied","Data":"a13347e91b2c358f88ddcbe4ed05e22a51218beb5b260889bb39053ee6c001e2"} Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.204729 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-977b585d5-q75dz" event={"ID":"c651616e-857f-4aae-a76b-79bf365695a3","Type":"ContainerStarted","Data":"51ff8f57b1aba70137d34831d35183108644c5e0b4758168c713244494e65377"} Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.237488 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4tvs4" event={"ID":"d7c46467-e60b-47fd-be7b-660d674b6504","Type":"ContainerStarted","Data":"7f40968c6bdee9e8fb71c6ec0ba22bcb3f072ed076a6b75797013af2e2e45f94"} Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.243274 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4tvs4" event={"ID":"d7c46467-e60b-47fd-be7b-660d674b6504","Type":"ContainerStarted","Data":"72b309b7af526586fb0c23c4468986a39a90292fd0a9345291847285ac400f07"} Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.260203 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c","Type":"ContainerStarted","Data":"4ada966b7afba1b3f5e9530b999abd22ea7a850046b315324901727f54ed8195"} Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.261317 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.261414 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6hqg\" (UniqueName: \"kubernetes.io/projected/8220218e-209d-4528-a3ba-f8c057eb9740-kube-api-access-s6hqg\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.261472 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8220218e-209d-4528-a3ba-f8c057eb9740-logs\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.261503 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.261598 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.261638 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8220218e-209d-4528-a3ba-f8c057eb9740-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.261687 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.262081 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.270516 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8220218e-209d-4528-a3ba-f8c057eb9740-logs\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.271378 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8220218e-209d-4528-a3ba-f8c057eb9740-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc 
kubenswrapper[4691]: I1202 08:04:23.284495 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.292218 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6hqg\" (UniqueName: \"kubernetes.io/projected/8220218e-209d-4528-a3ba-f8c057eb9740-kube-api-access-s6hqg\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.294385 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.294679 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.295285 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jr4fw" event={"ID":"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f","Type":"ContainerStarted","Data":"4cbb75644dab97ed0e058501793455d3498cbeb17704bdf1e3d4fe2ce8895111"} Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.330851 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9" event={"ID":"d177ac4d-6f73-464e-8c82-a4d525328bf0","Type":"ContainerStarted","Data":"4008a512dfbb6afb0f172b8cd0d0415591ad6b5dccd2e34c473c145d6cad423c"} Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.347178 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-p84rj"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.356223 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-4tvs4" podStartSLOduration=3.356200463 podStartE2EDuration="3.356200463s" podCreationTimestamp="2025-12-02 08:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:04:23.259056865 +0000 UTC m=+1111.043135727" watchObservedRunningTime="2025-12-02 08:04:23.356200463 +0000 UTC m=+1111.140279325" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.383261 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-jr4fw" podStartSLOduration=3.383242668 podStartE2EDuration="3.383242668s" podCreationTimestamp="2025-12-02 08:04:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:04:23.330737187 +0000 UTC m=+1111.114816049" watchObservedRunningTime="2025-12-02 08:04:23.383242668 +0000 UTC m=+1111.167321530" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.417086 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.446303 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.469505 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.489824 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-584b486865-hjdcx"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.563238 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7c4b8f5f8f-l7b75"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.569516 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.616952 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c4b8f5f8f-l7b75"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.821774 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24910bce-2ac5-4966-af8e-48dad2b11370-scripts\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.821821 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24910bce-2ac5-4966-af8e-48dad2b11370-horizon-secret-key\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.821919 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wbpv\" (UniqueName: \"kubernetes.io/projected/24910bce-2ac5-4966-af8e-48dad2b11370-kube-api-access-8wbpv\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.821942 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24910bce-2ac5-4966-af8e-48dad2b11370-config-data\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.821991 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24910bce-2ac5-4966-af8e-48dad2b11370-logs\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.861612 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.926409 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wbpv\" 
(UniqueName: \"kubernetes.io/projected/24910bce-2ac5-4966-af8e-48dad2b11370-kube-api-access-8wbpv\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.926454 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24910bce-2ac5-4966-af8e-48dad2b11370-config-data\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.926496 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24910bce-2ac5-4966-af8e-48dad2b11370-logs\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.926582 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24910bce-2ac5-4966-af8e-48dad2b11370-scripts\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.926604 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24910bce-2ac5-4966-af8e-48dad2b11370-horizon-secret-key\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.929469 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24910bce-2ac5-4966-af8e-48dad2b11370-config-data\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.930316 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24910bce-2ac5-4966-af8e-48dad2b11370-logs\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.934384 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24910bce-2ac5-4966-af8e-48dad2b11370-scripts\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.941299 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24910bce-2ac5-4966-af8e-48dad2b11370-horizon-secret-key\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:23 crc kubenswrapper[4691]: I1202 08:04:23.957881 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wbpv\" (UniqueName: \"kubernetes.io/projected/24910bce-2ac5-4966-af8e-48dad2b11370-kube-api-access-8wbpv\") pod \"horizon-7c4b8f5f8f-l7b75\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " 
pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.015926 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.028373 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.048848 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.129861 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-config\") pod \"2be077f9-2259-48e5-818d-63804f9951ac\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.129955 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xm24\" (UniqueName: \"kubernetes.io/projected/2be077f9-2259-48e5-818d-63804f9951ac-kube-api-access-9xm24\") pod \"2be077f9-2259-48e5-818d-63804f9951ac\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.130240 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-dns-svc\") pod \"2be077f9-2259-48e5-818d-63804f9951ac\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.130329 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-dns-swift-storage-0\") pod \"2be077f9-2259-48e5-818d-63804f9951ac\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.130359 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-ovsdbserver-nb\") pod \"2be077f9-2259-48e5-818d-63804f9951ac\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.130409 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-ovsdbserver-sb\") pod \"2be077f9-2259-48e5-818d-63804f9951ac\" (UID: \"2be077f9-2259-48e5-818d-63804f9951ac\") " Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.137132 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2be077f9-2259-48e5-818d-63804f9951ac-kube-api-access-9xm24" (OuterVolumeSpecName: "kube-api-access-9xm24") pod "2be077f9-2259-48e5-818d-63804f9951ac" (UID: "2be077f9-2259-48e5-818d-63804f9951ac"). InnerVolumeSpecName "kube-api-access-9xm24". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.173228 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2be077f9-2259-48e5-818d-63804f9951ac" (UID: "2be077f9-2259-48e5-818d-63804f9951ac"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.196401 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2be077f9-2259-48e5-818d-63804f9951ac" (UID: "2be077f9-2259-48e5-818d-63804f9951ac"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.201177 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2be077f9-2259-48e5-818d-63804f9951ac" (UID: "2be077f9-2259-48e5-818d-63804f9951ac"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.201327 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-config" (OuterVolumeSpecName: "config") pod "2be077f9-2259-48e5-818d-63804f9951ac" (UID: "2be077f9-2259-48e5-818d-63804f9951ac"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.208194 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2be077f9-2259-48e5-818d-63804f9951ac" (UID: "2be077f9-2259-48e5-818d-63804f9951ac"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.233693 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.233733 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xm24\" (UniqueName: \"kubernetes.io/projected/2be077f9-2259-48e5-818d-63804f9951ac-kube-api-access-9xm24\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.233745 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.233754 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.233781 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.233792 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2be077f9-2259-48e5-818d-63804f9951ac-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.254714 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.367817 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" event={"ID":"8956bae5-b6fb-496d-95b1-775e634fb54b","Type":"ContainerStarted","Data":"a9b1cb461f06a2a84c670fe8f966cca2aea12a99cd3301f2ef0ca2163ef7fb25"} Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.373263 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" event={"ID":"2be077f9-2259-48e5-818d-63804f9951ac","Type":"ContainerDied","Data":"2865fe2c347967982c16b668032d211b864aa0615162e1dacc3cf6867188051b"} Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.373349 4691 scope.go:117] "RemoveContainer" containerID="a13347e91b2c358f88ddcbe4ed05e22a51218beb5b260889bb39053ee6c001e2" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.373562 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5959f8865f-qlrg7" Dec 02 08:04:24 crc kubenswrapper[4691]: I1202 08:04:24.395067 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67de45fd-9469-4c5f-aee4-cc4c8bb309c1","Type":"ContainerStarted","Data":"24ac254e05ae2a02e9668be75613993d323725f44a50e08fe0a3e98a8758f54f"} Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:24.627902 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-qlrg7"] Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:24.645661 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5959f8865f-qlrg7"] Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:24.665708 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:25.506819 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67de45fd-9469-4c5f-aee4-cc4c8bb309c1","Type":"ContainerStarted","Data":"8fcb8d0f1490fc71c88119d16d3f65030b121e7645cab0a7e1803ed573ad9c8c"} Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:25.515214 4691 generic.go:334] "Generic (PLEG): container finished" podID="8956bae5-b6fb-496d-95b1-775e634fb54b" containerID="5d22a6811808d10f8dba30dc4e421baaa132869cc560106aa1dc2f9f517f8815" exitCode=0 Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:25.515278 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" event={"ID":"8956bae5-b6fb-496d-95b1-775e634fb54b","Type":"ContainerDied","Data":"5d22a6811808d10f8dba30dc4e421baaa132869cc560106aa1dc2f9f517f8815"} Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:25.518997 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8220218e-209d-4528-a3ba-f8c057eb9740","Type":"ContainerStarted","Data":"3d07bb8f922602ec23963563cc432bbf90d7001450ce7db941f44e3fd8913545"} Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:25.525965 4691 generic.go:334] "Generic (PLEG): container finished" podID="d177ac4d-6f73-464e-8c82-a4d525328bf0" containerID="f4a4397b72389b846a606b51fc24f9ac24c25abf381a3fd893674d0658b3b48e" exitCode=0 Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:25.526043 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9" 
event={"ID":"d177ac4d-6f73-464e-8c82-a4d525328bf0","Type":"ContainerDied","Data":"f4a4397b72389b846a606b51fc24f9ac24c25abf381a3fd893674d0658b3b48e"} Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:26.557179 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8220218e-209d-4528-a3ba-f8c057eb9740","Type":"ContainerStarted","Data":"2f6f19966d913104bb9d9b6b3dcb56598b83c55d699e15c3546094db2db6f095"} Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:26.630231 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2be077f9-2259-48e5-818d-63804f9951ac" path="/var/lib/kubelet/pods/2be077f9-2259-48e5-818d-63804f9951ac/volumes" Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:27.518515 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-4k8xk" podUID="f75595fc-314d-4aa9-bc60-e82c16361768" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.89:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:27.621601 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" event={"ID":"8956bae5-b6fb-496d-95b1-775e634fb54b","Type":"ContainerStarted","Data":"b8db8bfef34f507c9cfa0f765ff0612b8805a7355d5e13cec8e5afceba413d61"} Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:27.623153 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:27.627325 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="67de45fd-9469-4c5f-aee4-cc4c8bb309c1" containerName="glance-log" containerID="cri-o://8fcb8d0f1490fc71c88119d16d3f65030b121e7645cab0a7e1803ed573ad9c8c" gracePeriod=30 Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:27.628497 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="67de45fd-9469-4c5f-aee4-cc4c8bb309c1" containerName="glance-httpd" containerID="cri-o://f8f10f123d04d7d2f17756dd65b88392017669549bde4d2b5d5dad6a47e87b6f" gracePeriod=30 Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:27.669588 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" podStartSLOduration=6.6695679089999995 podStartE2EDuration="6.669567909s" podCreationTimestamp="2025-12-02 08:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:04:27.647994058 +0000 UTC m=+1115.432072950" watchObservedRunningTime="2025-12-02 08:04:27.669567909 +0000 UTC m=+1115.453646771" Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:27.681116 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.6811002219999995 podStartE2EDuration="6.681100222s" podCreationTimestamp="2025-12-02 08:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:04:27.669383354 +0000 UTC m=+1115.453462216" watchObservedRunningTime="2025-12-02 08:04:27.681100222 +0000 UTC m=+1115.465179084" Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 
08:04:28.102024 4691 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-vgwkd container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: i/o timeout" start-of-body= Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:28.102082 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vgwkd" podUID="01d8467f-e617-4b68-ada8-440891bb4b51" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: i/o timeout" Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:28.640571 4691 generic.go:334] "Generic (PLEG): container finished" podID="67de45fd-9469-4c5f-aee4-cc4c8bb309c1" containerID="f8f10f123d04d7d2f17756dd65b88392017669549bde4d2b5d5dad6a47e87b6f" exitCode=143 Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:28.640965 4691 generic.go:334] "Generic (PLEG): container finished" podID="67de45fd-9469-4c5f-aee4-cc4c8bb309c1" containerID="8fcb8d0f1490fc71c88119d16d3f65030b121e7645cab0a7e1803ed573ad9c8c" exitCode=143 Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:28.642900 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67de45fd-9469-4c5f-aee4-cc4c8bb309c1","Type":"ContainerDied","Data":"f8f10f123d04d7d2f17756dd65b88392017669549bde4d2b5d5dad6a47e87b6f"} Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:28.642941 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67de45fd-9469-4c5f-aee4-cc4c8bb309c1","Type":"ContainerDied","Data":"8fcb8d0f1490fc71c88119d16d3f65030b121e7645cab0a7e1803ed573ad9c8c"} Dec 02 08:04:28 crc kubenswrapper[4691]: I1202 08:04:28.928998 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7c4b8f5f8f-l7b75"] Dec 02 08:04:30 crc kubenswrapper[4691]: I1202 08:04:30.661037 4691 generic.go:334] "Generic (PLEG): container finished" podID="4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f" containerID="4cbb75644dab97ed0e058501793455d3498cbeb17704bdf1e3d4fe2ce8895111" exitCode=0 Dec 02 08:04:30 crc kubenswrapper[4691]: I1202 08:04:30.661098 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jr4fw" event={"ID":"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f","Type":"ContainerDied","Data":"4cbb75644dab97ed0e058501793455d3498cbeb17704bdf1e3d4fe2ce8895111"} Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.165853 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.190037 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-ovsdbserver-sb\") pod \"d177ac4d-6f73-464e-8c82-a4d525328bf0\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.190106 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-dns-svc\") pod \"d177ac4d-6f73-464e-8c82-a4d525328bf0\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.190179 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-dns-swift-storage-0\") pod \"d177ac4d-6f73-464e-8c82-a4d525328bf0\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.190283 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-ovsdbserver-nb\") pod \"d177ac4d-6f73-464e-8c82-a4d525328bf0\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.190319 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlvwm\" (UniqueName: \"kubernetes.io/projected/d177ac4d-6f73-464e-8c82-a4d525328bf0-kube-api-access-dlvwm\") pod \"d177ac4d-6f73-464e-8c82-a4d525328bf0\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.190341 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-config\") pod \"d177ac4d-6f73-464e-8c82-a4d525328bf0\" (UID: \"d177ac4d-6f73-464e-8c82-a4d525328bf0\") " Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.211180 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d177ac4d-6f73-464e-8c82-a4d525328bf0-kube-api-access-dlvwm" (OuterVolumeSpecName: "kube-api-access-dlvwm") pod "d177ac4d-6f73-464e-8c82-a4d525328bf0" (UID: "d177ac4d-6f73-464e-8c82-a4d525328bf0"). InnerVolumeSpecName "kube-api-access-dlvwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.219081 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d177ac4d-6f73-464e-8c82-a4d525328bf0" (UID: "d177ac4d-6f73-464e-8c82-a4d525328bf0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.233258 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d177ac4d-6f73-464e-8c82-a4d525328bf0" (UID: "d177ac4d-6f73-464e-8c82-a4d525328bf0"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.235820 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d177ac4d-6f73-464e-8c82-a4d525328bf0" (UID: "d177ac4d-6f73-464e-8c82-a4d525328bf0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.265353 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-config" (OuterVolumeSpecName: "config") pod "d177ac4d-6f73-464e-8c82-a4d525328bf0" (UID: "d177ac4d-6f73-464e-8c82-a4d525328bf0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.281337 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d177ac4d-6f73-464e-8c82-a4d525328bf0" (UID: "d177ac4d-6f73-464e-8c82-a4d525328bf0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.292692 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.292729 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.292739 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.292749 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.292772 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlvwm\" (UniqueName: \"kubernetes.io/projected/d177ac4d-6f73-464e-8c82-a4d525328bf0-kube-api-access-dlvwm\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.292782 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d177ac4d-6f73-464e-8c82-a4d525328bf0-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.706987 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9" event={"ID":"d177ac4d-6f73-464e-8c82-a4d525328bf0","Type":"ContainerDied","Data":"4008a512dfbb6afb0f172b8cd0d0415591ad6b5dccd2e34c473c145d6cad423c"} Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.707375 4691 scope.go:117] "RemoveContainer" containerID="f4a4397b72389b846a606b51fc24f9ac24c25abf381a3fd893674d0658b3b48e" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.712278 4691 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dd9ff6bc-xw9k9" Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.792057 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"] Dec 02 08:04:31 crc kubenswrapper[4691]: I1202 08:04:31.801607 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dd9ff6bc-xw9k9"] Dec 02 08:04:32 crc kubenswrapper[4691]: I1202 08:04:32.059653 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" Dec 02 08:04:32 crc kubenswrapper[4691]: I1202 08:04:32.124279 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-29vqc"] Dec 02 08:04:32 crc kubenswrapper[4691]: I1202 08:04:32.124523 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" podUID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerName="dnsmasq-dns" containerID="cri-o://6e1fec54f20d4f1739f4c040799a33a98a9ef900f8d37ccc9d4b9737485f74e7" gracePeriod=10 Dec 02 08:04:32 crc kubenswrapper[4691]: I1202 08:04:32.173801 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" podUID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: connect: connection refused" Dec 02 08:04:32 crc kubenswrapper[4691]: I1202 08:04:32.576041 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d177ac4d-6f73-464e-8c82-a4d525328bf0" path="/var/lib/kubelet/pods/d177ac4d-6f73-464e-8c82-a4d525328bf0/volumes" Dec 02 08:04:32 crc kubenswrapper[4691]: I1202 08:04:32.718709 4691 generic.go:334] "Generic (PLEG): container finished" podID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerID="6e1fec54f20d4f1739f4c040799a33a98a9ef900f8d37ccc9d4b9737485f74e7" exitCode=0 Dec 02 08:04:32 crc kubenswrapper[4691]: I1202 08:04:32.718799 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" event={"ID":"7fc34093-aa4d-4e35-992b-c071d31705d5","Type":"ContainerDied","Data":"6e1fec54f20d4f1739f4c040799a33a98a9ef900f8d37ccc9d4b9737485f74e7"} Dec 02 08:04:32 crc kubenswrapper[4691]: I1202 08:04:32.724401 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8220218e-209d-4528-a3ba-f8c057eb9740","Type":"ContainerStarted","Data":"c8c3d0acf15a96aa47f3fdcd2fdd95fbfeb5d73dd93568d87387ab51b7f50796"} Dec 02 08:04:32 crc kubenswrapper[4691]: I1202 08:04:32.724569 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8220218e-209d-4528-a3ba-f8c057eb9740" containerName="glance-log" containerID="cri-o://2f6f19966d913104bb9d9b6b3dcb56598b83c55d699e15c3546094db2db6f095" gracePeriod=30 Dec 02 08:04:32 crc kubenswrapper[4691]: I1202 08:04:32.724678 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="8220218e-209d-4528-a3ba-f8c057eb9740" containerName="glance-httpd" containerID="cri-o://c8c3d0acf15a96aa47f3fdcd2fdd95fbfeb5d73dd93568d87387ab51b7f50796" gracePeriod=30 Dec 02 08:04:32 crc kubenswrapper[4691]: I1202 08:04:32.749579 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.749558247 podStartE2EDuration="11.749558247s" 
podCreationTimestamp="2025-12-02 08:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:04:32.748358318 +0000 UTC m=+1120.532437190" watchObservedRunningTime="2025-12-02 08:04:32.749558247 +0000 UTC m=+1120.533637109" Dec 02 08:04:33 crc kubenswrapper[4691]: I1202 08:04:33.736228 4691 generic.go:334] "Generic (PLEG): container finished" podID="8220218e-209d-4528-a3ba-f8c057eb9740" containerID="c8c3d0acf15a96aa47f3fdcd2fdd95fbfeb5d73dd93568d87387ab51b7f50796" exitCode=0 Dec 02 08:04:33 crc kubenswrapper[4691]: I1202 08:04:33.736580 4691 generic.go:334] "Generic (PLEG): container finished" podID="8220218e-209d-4528-a3ba-f8c057eb9740" containerID="2f6f19966d913104bb9d9b6b3dcb56598b83c55d699e15c3546094db2db6f095" exitCode=143 Dec 02 08:04:33 crc kubenswrapper[4691]: I1202 08:04:33.736330 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8220218e-209d-4528-a3ba-f8c057eb9740","Type":"ContainerDied","Data":"c8c3d0acf15a96aa47f3fdcd2fdd95fbfeb5d73dd93568d87387ab51b7f50796"} Dec 02 08:04:33 crc kubenswrapper[4691]: I1202 08:04:33.736637 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8220218e-209d-4528-a3ba-f8c057eb9740","Type":"ContainerDied","Data":"2f6f19966d913104bb9d9b6b3dcb56598b83c55d699e15c3546094db2db6f095"} Dec 02 08:04:33 crc kubenswrapper[4691]: I1202 08:04:33.937347 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-977b585d5-q75dz"] Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.020326 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-758b4cf594-fpkds"] Dec 02 08:04:34 crc kubenswrapper[4691]: E1202 08:04:34.020740 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2be077f9-2259-48e5-818d-63804f9951ac" containerName="init" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.020766 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2be077f9-2259-48e5-818d-63804f9951ac" containerName="init" Dec 02 08:04:34 crc kubenswrapper[4691]: E1202 08:04:34.020797 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d177ac4d-6f73-464e-8c82-a4d525328bf0" containerName="init" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.020802 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d177ac4d-6f73-464e-8c82-a4d525328bf0" containerName="init" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.021167 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d177ac4d-6f73-464e-8c82-a4d525328bf0" containerName="init" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.021202 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="2be077f9-2259-48e5-818d-63804f9951ac" containerName="init" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.022284 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.029056 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.045450 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-758b4cf594-fpkds"] Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.161461 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-combined-ca-bundle\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.161521 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-horizon-secret-key\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.161563 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a95b5239-be71-4b06-88b2-52875915162e-logs\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.161635 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a95b5239-be71-4b06-88b2-52875915162e-scripts\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.161686 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a95b5239-be71-4b06-88b2-52875915162e-config-data\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.161714 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-horizon-tls-certs\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.162019 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg54q\" (UniqueName: \"kubernetes.io/projected/a95b5239-be71-4b06-88b2-52875915162e-kube-api-access-kg54q\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.183201 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c4b8f5f8f-l7b75"] Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.223619 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6585c7db4b-jz894"] Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.225560 
4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.263564 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a95b5239-be71-4b06-88b2-52875915162e-scripts\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.263636 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a95b5239-be71-4b06-88b2-52875915162e-config-data\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.263666 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-horizon-tls-certs\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.263786 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg54q\" (UniqueName: \"kubernetes.io/projected/a95b5239-be71-4b06-88b2-52875915162e-kube-api-access-kg54q\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.263823 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-combined-ca-bundle\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.263845 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-horizon-secret-key\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.263881 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a95b5239-be71-4b06-88b2-52875915162e-logs\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.264237 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a95b5239-be71-4b06-88b2-52875915162e-logs\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.264747 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a95b5239-be71-4b06-88b2-52875915162e-scripts\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.270326 4691 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6585c7db4b-jz894"] Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.280925 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a95b5239-be71-4b06-88b2-52875915162e-config-data\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.283603 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-horizon-secret-key\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.285970 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-horizon-tls-certs\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.291316 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-combined-ca-bundle\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.315480 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg54q\" (UniqueName: \"kubernetes.io/projected/a95b5239-be71-4b06-88b2-52875915162e-kube-api-access-kg54q\") pod \"horizon-758b4cf594-fpkds\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.369250 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76022e0c-2dd2-4395-8607-aa13da42f557-logs\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.369380 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76022e0c-2dd2-4395-8607-aa13da42f557-scripts\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.369409 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76022e0c-2dd2-4395-8607-aa13da42f557-config-data\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.369445 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s44zc\" (UniqueName: \"kubernetes.io/projected/76022e0c-2dd2-4395-8607-aa13da42f557-kube-api-access-s44zc\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" 
Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.369476 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76022e0c-2dd2-4395-8607-aa13da42f557-horizon-tls-certs\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.369504 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76022e0c-2dd2-4395-8607-aa13da42f557-combined-ca-bundle\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.369543 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76022e0c-2dd2-4395-8607-aa13da42f557-horizon-secret-key\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.412749 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.470882 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76022e0c-2dd2-4395-8607-aa13da42f557-config-data\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.470935 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76022e0c-2dd2-4395-8607-aa13da42f557-scripts\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.470984 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s44zc\" (UniqueName: \"kubernetes.io/projected/76022e0c-2dd2-4395-8607-aa13da42f557-kube-api-access-s44zc\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.471025 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76022e0c-2dd2-4395-8607-aa13da42f557-horizon-tls-certs\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.471071 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76022e0c-2dd2-4395-8607-aa13da42f557-combined-ca-bundle\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.471127 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/76022e0c-2dd2-4395-8607-aa13da42f557-horizon-secret-key\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.471166 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76022e0c-2dd2-4395-8607-aa13da42f557-logs\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.471874 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76022e0c-2dd2-4395-8607-aa13da42f557-logs\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.471982 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/76022e0c-2dd2-4395-8607-aa13da42f557-scripts\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.472202 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/76022e0c-2dd2-4395-8607-aa13da42f557-config-data\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.476244 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/76022e0c-2dd2-4395-8607-aa13da42f557-horizon-secret-key\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.476661 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/76022e0c-2dd2-4395-8607-aa13da42f557-horizon-tls-certs\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.482391 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76022e0c-2dd2-4395-8607-aa13da42f557-combined-ca-bundle\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.491311 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s44zc\" (UniqueName: \"kubernetes.io/projected/76022e0c-2dd2-4395-8607-aa13da42f557-kube-api-access-s44zc\") pod \"horizon-6585c7db4b-jz894\" (UID: \"76022e0c-2dd2-4395-8607-aa13da42f557\") " pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: W1202 08:04:34.531302 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24910bce_2ac5_4966_af8e_48dad2b11370.slice/crio-79e9eab263c34f2eccc61b1229d709b86c0300ae4dd5d9c599466bc6129473a2 WatchSource:0}: Error finding container 
79e9eab263c34f2eccc61b1229d709b86c0300ae4dd5d9c599466bc6129473a2: Status 404 returned error can't find the container with id 79e9eab263c34f2eccc61b1229d709b86c0300ae4dd5d9c599466bc6129473a2 Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.563285 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:04:34 crc kubenswrapper[4691]: I1202 08:04:34.748803 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c4b8f5f8f-l7b75" event={"ID":"24910bce-2ac5-4966-af8e-48dad2b11370","Type":"ContainerStarted","Data":"79e9eab263c34f2eccc61b1229d709b86c0300ae4dd5d9c599466bc6129473a2"} Dec 02 08:04:37 crc kubenswrapper[4691]: I1202 08:04:37.174495 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" podUID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: connect: connection refused" Dec 02 08:04:42 crc kubenswrapper[4691]: I1202 08:04:42.174357 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" podUID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: connect: connection refused" Dec 02 08:04:42 crc kubenswrapper[4691]: I1202 08:04:42.174804 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" Dec 02 08:04:42 crc kubenswrapper[4691]: E1202 08:04:42.249870 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 02 08:04:42 crc kubenswrapper[4691]: E1202 08:04:42.250052 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n665h66dh579h67h57h648h64fh64ch567h59ch8bh674h66dh5b9h86h5d4h65ch5b6h65bh5f6hbh667h559h5bbh577h664h5cfh69h688hc9h64h659q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ms6tl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-977b585d5-q75dz_openstack(c651616e-857f-4aae-a76b-79bf365695a3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:04:42 crc kubenswrapper[4691]: E1202 08:04:42.257464 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-977b585d5-q75dz" podUID="c651616e-857f-4aae-a76b-79bf365695a3" Dec 02 08:04:44 crc kubenswrapper[4691]: E1202 08:04:44.316219 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Dec 02 08:04:44 crc kubenswrapper[4691]: E1202 08:04:44.316956 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n67h5d9h6dh68fh564h569hddh77h549h65ch99h686h87h57h665h5f6hc6h9ch5c4h5d6hf4h5f6hfdhd8h668h648h5c9h599h5ffh59h67bh597q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2wcrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-584b486865-hjdcx_openstack(03a43183-f7c0-456d-8843-d94b1e97e51e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:04:44 crc kubenswrapper[4691]: E1202 08:04:44.320412 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-584b486865-hjdcx" podUID="03a43183-f7c0-456d-8843-d94b1e97e51e" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.369289 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.485234 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-config-data\") pod \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.485602 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hhhc\" (UniqueName: \"kubernetes.io/projected/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-kube-api-access-4hhhc\") pod \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.485833 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-credential-keys\") pod \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.485868 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-scripts\") pod \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.485902 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-combined-ca-bundle\") pod \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.486003 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-fernet-keys\") pod \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\" (UID: \"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f\") " Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.493603 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-scripts" (OuterVolumeSpecName: "scripts") pod "4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f" (UID: "4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.493834 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-kube-api-access-4hhhc" (OuterVolumeSpecName: "kube-api-access-4hhhc") pod "4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f" (UID: "4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f"). InnerVolumeSpecName "kube-api-access-4hhhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.498392 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f" (UID: "4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.498677 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f" (UID: "4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.527632 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f" (UID: "4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.525233 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-config-data" (OuterVolumeSpecName: "config-data") pod "4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f" (UID: "4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.605022 4691 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.605053 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.605064 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hhhc\" (UniqueName: \"kubernetes.io/projected/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-kube-api-access-4hhhc\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.605075 4691 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.605084 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.605094 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.856469 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-jr4fw" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.857801 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-jr4fw" event={"ID":"4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f","Type":"ContainerDied","Data":"745535875ba7fe9dcd1be2d4642caf28fbb95705efb80966bc17ae37139205e3"} Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.857834 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="745535875ba7fe9dcd1be2d4642caf28fbb95705efb80966bc17ae37139205e3" Dec 02 08:04:44 crc kubenswrapper[4691]: I1202 08:04:44.893771 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-758b4cf594-fpkds"] Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.460653 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-jr4fw"] Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.470084 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-jr4fw"] Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.558342 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-svfbx"] Dec 02 08:04:45 crc kubenswrapper[4691]: E1202 08:04:45.558958 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f" containerName="keystone-bootstrap" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.558981 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f" containerName="keystone-bootstrap" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.559230 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f" containerName="keystone-bootstrap" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.559982 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.563033 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.563836 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.564031 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.571243 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lmmt4" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.576973 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-svfbx"] Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.577141 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.635147 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-config-data\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.635362 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-scripts\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.635418 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-combined-ca-bundle\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.635590 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk78w\" (UniqueName: \"kubernetes.io/projected/94701dd1-f34b-4bdc-bf74-f67799127cd5-kube-api-access-tk78w\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.635670 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-credential-keys\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.635700 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-fernet-keys\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.737907 4691 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-scripts\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.737959 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-combined-ca-bundle\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.738020 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk78w\" (UniqueName: \"kubernetes.io/projected/94701dd1-f34b-4bdc-bf74-f67799127cd5-kube-api-access-tk78w\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.738053 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-credential-keys\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.738072 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-fernet-keys\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.738128 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-config-data\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.745367 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-credential-keys\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.745524 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-fernet-keys\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.754500 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-combined-ca-bundle\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.755679 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-config-data\") pod \"keystone-bootstrap-svfbx\" (UID: 
\"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.757431 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-scripts\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.764438 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk78w\" (UniqueName: \"kubernetes.io/projected/94701dd1-f34b-4bdc-bf74-f67799127cd5-kube-api-access-tk78w\") pod \"keystone-bootstrap-svfbx\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") " pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:45 crc kubenswrapper[4691]: I1202 08:04:45.889304 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-svfbx" Dec 02 08:04:46 crc kubenswrapper[4691]: I1202 08:04:46.572244 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f" path="/var/lib/kubelet/pods/4f0ebe2d-0a7d-460f-b7a4-8a6ff7bbcb9f/volumes" Dec 02 08:04:52 crc kubenswrapper[4691]: I1202 08:04:52.174245 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" podUID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: i/o timeout" Dec 02 08:04:52 crc kubenswrapper[4691]: I1202 08:04:52.842723 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 08:04:52 crc kubenswrapper[4691]: I1202 08:04:52.842793 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.214438 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.226164 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.266128 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-scripts\") pod \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.266182 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-logs\") pod \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.266241 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c651616e-857f-4aae-a76b-79bf365695a3-config-data\") pod \"c651616e-857f-4aae-a76b-79bf365695a3\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.266288 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.266389 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8cjc\" (UniqueName: \"kubernetes.io/projected/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-kube-api-access-w8cjc\") pod \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.266411 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c651616e-857f-4aae-a76b-79bf365695a3-horizon-secret-key\") pod \"c651616e-857f-4aae-a76b-79bf365695a3\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.266485 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-config-data\") pod \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.266554 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms6tl\" (UniqueName: \"kubernetes.io/projected/c651616e-857f-4aae-a76b-79bf365695a3-kube-api-access-ms6tl\") pod \"c651616e-857f-4aae-a76b-79bf365695a3\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.266573 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c651616e-857f-4aae-a76b-79bf365695a3-logs\") pod \"c651616e-857f-4aae-a76b-79bf365695a3\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.266610 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-combined-ca-bundle\") pod \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " Dec 02 
08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.266649 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c651616e-857f-4aae-a76b-79bf365695a3-scripts\") pod \"c651616e-857f-4aae-a76b-79bf365695a3\" (UID: \"c651616e-857f-4aae-a76b-79bf365695a3\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.267591 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-httpd-run\") pod \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\" (UID: \"67de45fd-9469-4c5f-aee4-cc4c8bb309c1\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.268787 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "67de45fd-9469-4c5f-aee4-cc4c8bb309c1" (UID: "67de45fd-9469-4c5f-aee4-cc4c8bb309c1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.270619 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c651616e-857f-4aae-a76b-79bf365695a3-config-data" (OuterVolumeSpecName: "config-data") pod "c651616e-857f-4aae-a76b-79bf365695a3" (UID: "c651616e-857f-4aae-a76b-79bf365695a3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.270667 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-logs" (OuterVolumeSpecName: "logs") pod "67de45fd-9469-4c5f-aee4-cc4c8bb309c1" (UID: "67de45fd-9469-4c5f-aee4-cc4c8bb309c1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.271013 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c651616e-857f-4aae-a76b-79bf365695a3-logs" (OuterVolumeSpecName: "logs") pod "c651616e-857f-4aae-a76b-79bf365695a3" (UID: "c651616e-857f-4aae-a76b-79bf365695a3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.271870 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c651616e-857f-4aae-a76b-79bf365695a3-scripts" (OuterVolumeSpecName: "scripts") pod "c651616e-857f-4aae-a76b-79bf365695a3" (UID: "c651616e-857f-4aae-a76b-79bf365695a3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.275557 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c651616e-857f-4aae-a76b-79bf365695a3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "c651616e-857f-4aae-a76b-79bf365695a3" (UID: "c651616e-857f-4aae-a76b-79bf365695a3"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.301999 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c651616e-857f-4aae-a76b-79bf365695a3-kube-api-access-ms6tl" (OuterVolumeSpecName: "kube-api-access-ms6tl") pod "c651616e-857f-4aae-a76b-79bf365695a3" (UID: "c651616e-857f-4aae-a76b-79bf365695a3"). InnerVolumeSpecName "kube-api-access-ms6tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.302024 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-kube-api-access-w8cjc" (OuterVolumeSpecName: "kube-api-access-w8cjc") pod "67de45fd-9469-4c5f-aee4-cc4c8bb309c1" (UID: "67de45fd-9469-4c5f-aee4-cc4c8bb309c1"). InnerVolumeSpecName "kube-api-access-w8cjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.302002 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-scripts" (OuterVolumeSpecName: "scripts") pod "67de45fd-9469-4c5f-aee4-cc4c8bb309c1" (UID: "67de45fd-9469-4c5f-aee4-cc4c8bb309c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.302171 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "67de45fd-9469-4c5f-aee4-cc4c8bb309c1" (UID: "67de45fd-9469-4c5f-aee4-cc4c8bb309c1"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.306208 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67de45fd-9469-4c5f-aee4-cc4c8bb309c1" (UID: "67de45fd-9469-4c5f-aee4-cc4c8bb309c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.331812 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-config-data" (OuterVolumeSpecName: "config-data") pod "67de45fd-9469-4c5f-aee4-cc4c8bb309c1" (UID: "67de45fd-9469-4c5f-aee4-cc4c8bb309c1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.369691 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms6tl\" (UniqueName: \"kubernetes.io/projected/c651616e-857f-4aae-a76b-79bf365695a3-kube-api-access-ms6tl\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.369995 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c651616e-857f-4aae-a76b-79bf365695a3-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.370089 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.370169 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c651616e-857f-4aae-a76b-79bf365695a3-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.370269 4691 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.370351 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.370427 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.370588 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c651616e-857f-4aae-a76b-79bf365695a3-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.370696 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.370790 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8cjc\" (UniqueName: \"kubernetes.io/projected/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-kube-api-access-w8cjc\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.370877 4691 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/c651616e-857f-4aae-a76b-79bf365695a3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.370951 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67de45fd-9469-4c5f-aee4-cc4c8bb309c1-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.393451 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.448050 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-internal-api-0" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.448348 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.472522 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: E1202 08:04:53.653363 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 02 08:04:53 crc kubenswrapper[4691]: E1202 08:04:53.653553 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b6xrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-7sh8f_openstack(7f939f3c-07b4-42b8-94d9-3dbd15c03287): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:04:53 crc kubenswrapper[4691]: E1202 08:04:53.654835 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-7sh8f" podUID="7f939f3c-07b4-42b8-94d9-3dbd15c03287" Dec 02 08:04:53 crc kubenswrapper[4691]: W1202 08:04:53.672100 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda95b5239_be71_4b06_88b2_52875915162e.slice/crio-43b683a099f5e7a29c140d75614e74d4d36686f4d8c86ef07a7a20205f8acc30 WatchSource:0}: Error finding container 43b683a099f5e7a29c140d75614e74d4d36686f4d8c86ef07a7a20205f8acc30: Status 404 returned error can't find the container with id 
43b683a099f5e7a29c140d75614e74d4d36686f4d8c86ef07a7a20205f8acc30 Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.674161 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.680873 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.692057 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-584b486865-hjdcx" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.878916 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-ovsdbserver-nb\") pod \"7fc34093-aa4d-4e35-992b-c071d31705d5\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.878985 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-scripts\") pod \"8220218e-209d-4528-a3ba-f8c057eb9740\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879058 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03a43183-f7c0-456d-8843-d94b1e97e51e-config-data\") pod \"03a43183-f7c0-456d-8843-d94b1e97e51e\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879103 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5qnb\" (UniqueName: \"kubernetes.io/projected/7fc34093-aa4d-4e35-992b-c071d31705d5-kube-api-access-c5qnb\") pod \"7fc34093-aa4d-4e35-992b-c071d31705d5\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879144 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-config\") pod \"7fc34093-aa4d-4e35-992b-c071d31705d5\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879202 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6hqg\" (UniqueName: \"kubernetes.io/projected/8220218e-209d-4528-a3ba-f8c057eb9740-kube-api-access-s6hqg\") pod \"8220218e-209d-4528-a3ba-f8c057eb9740\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879254 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-dns-svc\") pod \"7fc34093-aa4d-4e35-992b-c071d31705d5\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879279 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03a43183-f7c0-456d-8843-d94b1e97e51e-scripts\") pod \"03a43183-f7c0-456d-8843-d94b1e97e51e\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879316 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"8220218e-209d-4528-a3ba-f8c057eb9740\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879400 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8220218e-209d-4528-a3ba-f8c057eb9740-logs\") pod \"8220218e-209d-4528-a3ba-f8c057eb9740\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879435 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-ovsdbserver-sb\") pod \"7fc34093-aa4d-4e35-992b-c071d31705d5\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879460 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-combined-ca-bundle\") pod \"8220218e-209d-4528-a3ba-f8c057eb9740\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879485 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/03a43183-f7c0-456d-8843-d94b1e97e51e-horizon-secret-key\") pod \"03a43183-f7c0-456d-8843-d94b1e97e51e\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879516 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-config-data\") pod \"8220218e-209d-4528-a3ba-f8c057eb9740\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879550 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03a43183-f7c0-456d-8843-d94b1e97e51e-logs\") pod \"03a43183-f7c0-456d-8843-d94b1e97e51e\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879570 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8220218e-209d-4528-a3ba-f8c057eb9740-httpd-run\") pod \"8220218e-209d-4528-a3ba-f8c057eb9740\" (UID: \"8220218e-209d-4528-a3ba-f8c057eb9740\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879594 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wcrm\" (UniqueName: \"kubernetes.io/projected/03a43183-f7c0-456d-8843-d94b1e97e51e-kube-api-access-2wcrm\") pod \"03a43183-f7c0-456d-8843-d94b1e97e51e\" (UID: \"03a43183-f7c0-456d-8843-d94b1e97e51e\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.879616 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-dns-swift-storage-0\") pod \"7fc34093-aa4d-4e35-992b-c071d31705d5\" (UID: \"7fc34093-aa4d-4e35-992b-c071d31705d5\") " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.880567 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/03a43183-f7c0-456d-8843-d94b1e97e51e-config-data" (OuterVolumeSpecName: "config-data") pod "03a43183-f7c0-456d-8843-d94b1e97e51e" (UID: "03a43183-f7c0-456d-8843-d94b1e97e51e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.881462 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03a43183-f7c0-456d-8843-d94b1e97e51e-scripts" (OuterVolumeSpecName: "scripts") pod "03a43183-f7c0-456d-8843-d94b1e97e51e" (UID: "03a43183-f7c0-456d-8843-d94b1e97e51e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.881811 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8220218e-209d-4528-a3ba-f8c057eb9740-logs" (OuterVolumeSpecName: "logs") pod "8220218e-209d-4528-a3ba-f8c057eb9740" (UID: "8220218e-209d-4528-a3ba-f8c057eb9740"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.882209 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8220218e-209d-4528-a3ba-f8c057eb9740-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8220218e-209d-4528-a3ba-f8c057eb9740" (UID: "8220218e-209d-4528-a3ba-f8c057eb9740"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.882991 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03a43183-f7c0-456d-8843-d94b1e97e51e-logs" (OuterVolumeSpecName: "logs") pod "03a43183-f7c0-456d-8843-d94b1e97e51e" (UID: "03a43183-f7c0-456d-8843-d94b1e97e51e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.884533 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc34093-aa4d-4e35-992b-c071d31705d5-kube-api-access-c5qnb" (OuterVolumeSpecName: "kube-api-access-c5qnb") pod "7fc34093-aa4d-4e35-992b-c071d31705d5" (UID: "7fc34093-aa4d-4e35-992b-c071d31705d5"). InnerVolumeSpecName "kube-api-access-c5qnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.885229 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8220218e-209d-4528-a3ba-f8c057eb9740-kube-api-access-s6hqg" (OuterVolumeSpecName: "kube-api-access-s6hqg") pod "8220218e-209d-4528-a3ba-f8c057eb9740" (UID: "8220218e-209d-4528-a3ba-f8c057eb9740"). InnerVolumeSpecName "kube-api-access-s6hqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.887001 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-scripts" (OuterVolumeSpecName: "scripts") pod "8220218e-209d-4528-a3ba-f8c057eb9740" (UID: "8220218e-209d-4528-a3ba-f8c057eb9740"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.887960 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "8220218e-209d-4528-a3ba-f8c057eb9740" (UID: "8220218e-209d-4528-a3ba-f8c057eb9740"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.891247 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03a43183-f7c0-456d-8843-d94b1e97e51e-kube-api-access-2wcrm" (OuterVolumeSpecName: "kube-api-access-2wcrm") pod "03a43183-f7c0-456d-8843-d94b1e97e51e" (UID: "03a43183-f7c0-456d-8843-d94b1e97e51e"). InnerVolumeSpecName "kube-api-access-2wcrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.894939 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03a43183-f7c0-456d-8843-d94b1e97e51e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "03a43183-f7c0-456d-8843-d94b1e97e51e" (UID: "03a43183-f7c0-456d-8843-d94b1e97e51e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.912140 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8220218e-209d-4528-a3ba-f8c057eb9740" (UID: "8220218e-209d-4528-a3ba-f8c057eb9740"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.923053 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7fc34093-aa4d-4e35-992b-c071d31705d5" (UID: "7fc34093-aa4d-4e35-992b-c071d31705d5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.926071 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7fc34093-aa4d-4e35-992b-c071d31705d5" (UID: "7fc34093-aa4d-4e35-992b-c071d31705d5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.930889 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7fc34093-aa4d-4e35-992b-c071d31705d5" (UID: "7fc34093-aa4d-4e35-992b-c071d31705d5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.932043 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-config-data" (OuterVolumeSpecName: "config-data") pod "8220218e-209d-4528-a3ba-f8c057eb9740" (UID: "8220218e-209d-4528-a3ba-f8c057eb9740"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.932555 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-config" (OuterVolumeSpecName: "config") pod "7fc34093-aa4d-4e35-992b-c071d31705d5" (UID: "7fc34093-aa4d-4e35-992b-c071d31705d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.938465 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7fc34093-aa4d-4e35-992b-c071d31705d5" (UID: "7fc34093-aa4d-4e35-992b-c071d31705d5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981485 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981516 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6hqg\" (UniqueName: \"kubernetes.io/projected/8220218e-209d-4528-a3ba-f8c057eb9740-kube-api-access-s6hqg\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981528 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981537 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/03a43183-f7c0-456d-8843-d94b1e97e51e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981576 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981587 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8220218e-209d-4528-a3ba-f8c057eb9740-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981597 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981608 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981616 4691 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/03a43183-f7c0-456d-8843-d94b1e97e51e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981624 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 
08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981631 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/03a43183-f7c0-456d-8843-d94b1e97e51e-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981641 4691 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8220218e-209d-4528-a3ba-f8c057eb9740-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981649 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wcrm\" (UniqueName: \"kubernetes.io/projected/03a43183-f7c0-456d-8843-d94b1e97e51e-kube-api-access-2wcrm\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981657 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981667 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7fc34093-aa4d-4e35-992b-c071d31705d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981675 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8220218e-209d-4528-a3ba-f8c057eb9740-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981684 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/03a43183-f7c0-456d-8843-d94b1e97e51e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.981694 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5qnb\" (UniqueName: \"kubernetes.io/projected/7fc34093-aa4d-4e35-992b-c071d31705d5-kube-api-access-c5qnb\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:53 crc kubenswrapper[4691]: I1202 08:04:53.998677 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.083614 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.118642 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-584b486865-hjdcx" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.118654 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-584b486865-hjdcx" event={"ID":"03a43183-f7c0-456d-8843-d94b1e97e51e","Type":"ContainerDied","Data":"a49e8a1d540d83969f4ca6bec849f0957c3b4129ac1250acc0fab1420c24189d"} Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.124780 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" event={"ID":"7fc34093-aa4d-4e35-992b-c071d31705d5","Type":"ContainerDied","Data":"a02e4dfb5e36f1b0af7065957e3ee7bdec2f7e49b4656239541f8eb8eac7ef21"} Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.124990 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.125026 4691 scope.go:117] "RemoveContainer" containerID="6e1fec54f20d4f1739f4c040799a33a98a9ef900f8d37ccc9d4b9737485f74e7" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.126670 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-977b585d5-q75dz" event={"ID":"c651616e-857f-4aae-a76b-79bf365695a3","Type":"ContainerDied","Data":"51ff8f57b1aba70137d34831d35183108644c5e0b4758168c713244494e65377"} Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.126794 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-977b585d5-q75dz" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.131605 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"67de45fd-9469-4c5f-aee4-cc4c8bb309c1","Type":"ContainerDied","Data":"24ac254e05ae2a02e9668be75613993d323725f44a50e08fe0a3e98a8758f54f"} Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.131724 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.135110 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758b4cf594-fpkds" event={"ID":"a95b5239-be71-4b06-88b2-52875915162e","Type":"ContainerStarted","Data":"43b683a099f5e7a29c140d75614e74d4d36686f4d8c86ef07a7a20205f8acc30"} Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.142155 4691 generic.go:334] "Generic (PLEG): container finished" podID="d7c46467-e60b-47fd-be7b-660d674b6504" containerID="7f40968c6bdee9e8fb71c6ec0ba22bcb3f072ed076a6b75797013af2e2e45f94" exitCode=0 Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.142358 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4tvs4" event={"ID":"d7c46467-e60b-47fd-be7b-660d674b6504","Type":"ContainerDied","Data":"7f40968c6bdee9e8fb71c6ec0ba22bcb3f072ed076a6b75797013af2e2e45f94"} Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.148847 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8220218e-209d-4528-a3ba-f8c057eb9740","Type":"ContainerDied","Data":"3d07bb8f922602ec23963563cc432bbf90d7001450ce7db941f44e3fd8913545"} Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.148896 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: E1202 08:04:54.153125 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-7sh8f" podUID="7f939f3c-07b4-42b8-94d9-3dbd15c03287" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.233316 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-29vqc"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.241098 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-29vqc"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.273235 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-977b585d5-q75dz"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.297799 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-977b585d5-q75dz"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.344165 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-584b486865-hjdcx"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.355017 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-584b486865-hjdcx"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.365432 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.378744 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.451503 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.473603 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.492698 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:04:54 crc kubenswrapper[4691]: E1202 08:04:54.493677 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerName="dnsmasq-dns" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.493837 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerName="dnsmasq-dns" Dec 02 08:04:54 crc kubenswrapper[4691]: E1202 08:04:54.493924 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8220218e-209d-4528-a3ba-f8c057eb9740" containerName="glance-httpd" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.494019 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8220218e-209d-4528-a3ba-f8c057eb9740" containerName="glance-httpd" Dec 02 08:04:54 crc kubenswrapper[4691]: E1202 08:04:54.494147 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67de45fd-9469-4c5f-aee4-cc4c8bb309c1" containerName="glance-log" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.494236 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="67de45fd-9469-4c5f-aee4-cc4c8bb309c1" containerName="glance-log" Dec 02 08:04:54 crc kubenswrapper[4691]: E1202 08:04:54.494338 4691 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8220218e-209d-4528-a3ba-f8c057eb9740" containerName="glance-log" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.494429 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8220218e-209d-4528-a3ba-f8c057eb9740" containerName="glance-log" Dec 02 08:04:54 crc kubenswrapper[4691]: E1202 08:04:54.494535 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67de45fd-9469-4c5f-aee4-cc4c8bb309c1" containerName="glance-httpd" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.494613 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="67de45fd-9469-4c5f-aee4-cc4c8bb309c1" containerName="glance-httpd" Dec 02 08:04:54 crc kubenswrapper[4691]: E1202 08:04:54.494707 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerName="init" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.494823 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerName="init" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.495446 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="67de45fd-9469-4c5f-aee4-cc4c8bb309c1" containerName="glance-httpd" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.495520 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8220218e-209d-4528-a3ba-f8c057eb9740" containerName="glance-httpd" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.495580 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerName="dnsmasq-dns" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.495649 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8220218e-209d-4528-a3ba-f8c057eb9740" containerName="glance-log" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.495713 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="67de45fd-9469-4c5f-aee4-cc4c8bb309c1" containerName="glance-log" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.497328 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.502671 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.502716 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-5ngw7" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.503194 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.503389 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.508148 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.512057 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.515818 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.517826 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.522856 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.533102 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.579586 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03a43183-f7c0-456d-8843-d94b1e97e51e" path="/var/lib/kubelet/pods/03a43183-f7c0-456d-8843-d94b1e97e51e/volumes" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.580815 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67de45fd-9469-4c5f-aee4-cc4c8bb309c1" path="/var/lib/kubelet/pods/67de45fd-9469-4c5f-aee4-cc4c8bb309c1/volumes" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.582004 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc34093-aa4d-4e35-992b-c071d31705d5" path="/var/lib/kubelet/pods/7fc34093-aa4d-4e35-992b-c071d31705d5/volumes" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.587188 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8220218e-209d-4528-a3ba-f8c057eb9740" path="/var/lib/kubelet/pods/8220218e-209d-4528-a3ba-f8c057eb9740/volumes" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.588262 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c651616e-857f-4aae-a76b-79bf365695a3" path="/var/lib/kubelet/pods/c651616e-857f-4aae-a76b-79bf365695a3/volumes" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.647485 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.647539 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f883f9e-2ece-4a76-85c8-46cca73e0796-logs\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.647576 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.647597 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a7091d4-3a8f-4c52-b3e8-b35913233371-logs\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " 
pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.647626 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z22j\" (UniqueName: \"kubernetes.io/projected/0f883f9e-2ece-4a76-85c8-46cca73e0796-kube-api-access-2z22j\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.647648 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f883f9e-2ece-4a76-85c8-46cca73e0796-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.647672 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.647707 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.647725 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.647743 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.647808 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-scripts\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.647960 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-config-data\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.648020 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.648053 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.648078 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x2h7\" (UniqueName: \"kubernetes.io/projected/0a7091d4-3a8f-4c52-b3e8-b35913233371-kube-api-access-7x2h7\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.648112 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a7091d4-3a8f-4c52-b3e8-b35913233371-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750035 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750093 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750130 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-scripts\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750179 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-config-data\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750205 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750233 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750261 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x2h7\" (UniqueName: \"kubernetes.io/projected/0a7091d4-3a8f-4c52-b3e8-b35913233371-kube-api-access-7x2h7\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750288 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a7091d4-3a8f-4c52-b3e8-b35913233371-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750355 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750381 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f883f9e-2ece-4a76-85c8-46cca73e0796-logs\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750432 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750463 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a7091d4-3a8f-4c52-b3e8-b35913233371-logs\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750510 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z22j\" (UniqueName: \"kubernetes.io/projected/0f883f9e-2ece-4a76-85c8-46cca73e0796-kube-api-access-2z22j\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750554 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f883f9e-2ece-4a76-85c8-46cca73e0796-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750594 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.750647 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.751847 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f883f9e-2ece-4a76-85c8-46cca73e0796-logs\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.751842 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.751924 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.752638 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f883f9e-2ece-4a76-85c8-46cca73e0796-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.753302 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a7091d4-3a8f-4c52-b3e8-b35913233371-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.754837 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a7091d4-3a8f-4c52-b3e8-b35913233371-logs\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.757203 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.758015 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-config-data\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " 
pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.759343 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.759997 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.760649 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.761072 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-scripts\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.762085 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.772351 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.774741 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x2h7\" (UniqueName: \"kubernetes.io/projected/0a7091d4-3a8f-4c52-b3e8-b35913233371-kube-api-access-7x2h7\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.776287 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z22j\" (UniqueName: \"kubernetes.io/projected/0f883f9e-2ece-4a76-85c8-46cca73e0796-kube-api-access-2z22j\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.781520 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.784233 4691 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.834038 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:04:54 crc kubenswrapper[4691]: I1202 08:04:54.844624 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:04:55 crc kubenswrapper[4691]: I1202 08:04:55.432157 4691 scope.go:117] "RemoveContainer" containerID="e524a5b59a0465f5b3325213907faa9317da3ac7ddb909d796c2aaf09952f508" Dec 02 08:04:55 crc kubenswrapper[4691]: E1202 08:04:55.567984 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 02 08:04:55 crc kubenswrapper[4691]: E1202 08:04:55.568274 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6s5hb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,Resi
zePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-6qz2r_openstack(a217e1fe-be30-4247-91f3-020aaa089689): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:04:55 crc kubenswrapper[4691]: E1202 08:04:55.571819 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-6qz2r" podUID="a217e1fe-be30-4247-91f3-020aaa089689" Dec 02 08:04:55 crc kubenswrapper[4691]: I1202 08:04:55.750873 4691 scope.go:117] "RemoveContainer" containerID="f8f10f123d04d7d2f17756dd65b88392017669549bde4d2b5d5dad6a47e87b6f" Dec 02 08:04:55 crc kubenswrapper[4691]: I1202 08:04:55.800659 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4tvs4" Dec 02 08:04:55 crc kubenswrapper[4691]: I1202 08:04:55.812700 4691 scope.go:117] "RemoveContainer" containerID="8fcb8d0f1490fc71c88119d16d3f65030b121e7645cab0a7e1803ed573ad9c8c" Dec 02 08:04:55 crc kubenswrapper[4691]: I1202 08:04:55.902845 4691 scope.go:117] "RemoveContainer" containerID="c8c3d0acf15a96aa47f3fdcd2fdd95fbfeb5d73dd93568d87387ab51b7f50796" Dec 02 08:04:55 crc kubenswrapper[4691]: I1202 08:04:55.937122 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt7bf\" (UniqueName: \"kubernetes.io/projected/d7c46467-e60b-47fd-be7b-660d674b6504-kube-api-access-wt7bf\") pod \"d7c46467-e60b-47fd-be7b-660d674b6504\" (UID: \"d7c46467-e60b-47fd-be7b-660d674b6504\") " Dec 02 08:04:55 crc kubenswrapper[4691]: I1202 08:04:55.937219 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7c46467-e60b-47fd-be7b-660d674b6504-config\") pod \"d7c46467-e60b-47fd-be7b-660d674b6504\" (UID: \"d7c46467-e60b-47fd-be7b-660d674b6504\") " Dec 02 08:04:55 crc kubenswrapper[4691]: I1202 08:04:55.937242 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c46467-e60b-47fd-be7b-660d674b6504-combined-ca-bundle\") pod \"d7c46467-e60b-47fd-be7b-660d674b6504\" (UID: \"d7c46467-e60b-47fd-be7b-660d674b6504\") " Dec 02 08:04:55 crc kubenswrapper[4691]: I1202 08:04:55.968586 4691 scope.go:117] "RemoveContainer" containerID="2f6f19966d913104bb9d9b6b3dcb56598b83c55d699e15c3546094db2db6f095" Dec 02 08:04:55 crc kubenswrapper[4691]: I1202 08:04:55.979169 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7c46467-e60b-47fd-be7b-660d674b6504-kube-api-access-wt7bf" (OuterVolumeSpecName: "kube-api-access-wt7bf") pod "d7c46467-e60b-47fd-be7b-660d674b6504" (UID: "d7c46467-e60b-47fd-be7b-660d674b6504"). InnerVolumeSpecName "kube-api-access-wt7bf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.039353 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt7bf\" (UniqueName: \"kubernetes.io/projected/d7c46467-e60b-47fd-be7b-660d674b6504-kube-api-access-wt7bf\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.096523 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c46467-e60b-47fd-be7b-660d674b6504-config" (OuterVolumeSpecName: "config") pod "d7c46467-e60b-47fd-be7b-660d674b6504" (UID: "d7c46467-e60b-47fd-be7b-660d674b6504"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.105537 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7c46467-e60b-47fd-be7b-660d674b6504-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d7c46467-e60b-47fd-be7b-660d674b6504" (UID: "d7c46467-e60b-47fd-be7b-660d674b6504"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.136665 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6585c7db4b-jz894"] Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.141121 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d7c46467-e60b-47fd-be7b-660d674b6504-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.141150 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7c46467-e60b-47fd-be7b-660d674b6504-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.201535 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6585c7db4b-jz894" event={"ID":"76022e0c-2dd2-4395-8607-aa13da42f557","Type":"ContainerStarted","Data":"5ddb6ea6ed178665078e4ae0d85bda7ecd814bfbca8cd6ad39432c9fba1243ae"} Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.203865 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758b4cf594-fpkds" event={"ID":"a95b5239-be71-4b06-88b2-52875915162e","Type":"ContainerStarted","Data":"e8af3edd4341dbc165ca8c818abe8117ef90323a541e8dbd9510d558ed71bb9f"} Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.209846 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c","Type":"ContainerStarted","Data":"e69205269ae58f6dcbb3393c3e2d577675c6f416145c65fe9bb77fe4cfbb3d48"} Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.255022 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2zwm7" event={"ID":"570d8c0d-670c-4132-85e4-e13633c3bcc2","Type":"ContainerStarted","Data":"225c4639c39f31f374a90598a5cf3035e03cc249f5b00247468d64d85a944dd3"} Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.272805 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-svfbx"] Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.286832 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2zwm7" podStartSLOduration=3.31374627 podStartE2EDuration="36.286813264s" podCreationTimestamp="2025-12-02 
08:04:20 +0000 UTC" firstStartedPulling="2025-12-02 08:04:22.384948337 +0000 UTC m=+1110.169027199" lastFinishedPulling="2025-12-02 08:04:55.358015331 +0000 UTC m=+1143.142094193" observedRunningTime="2025-12-02 08:04:56.283193135 +0000 UTC m=+1144.067271997" watchObservedRunningTime="2025-12-02 08:04:56.286813264 +0000 UTC m=+1144.070892126" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.293216 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-4tvs4" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.293451 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-4tvs4" event={"ID":"d7c46467-e60b-47fd-be7b-660d674b6504","Type":"ContainerDied","Data":"72b309b7af526586fb0c23c4468986a39a90292fd0a9345291847285ac400f07"} Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.293501 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72b309b7af526586fb0c23c4468986a39a90292fd0a9345291847285ac400f07" Dec 02 08:04:56 crc kubenswrapper[4691]: E1202 08:04:56.297780 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-6qz2r" podUID="a217e1fe-be30-4247-91f3-020aaa089689" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.302435 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.508782 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-qrkdk"] Dec 02 08:04:56 crc kubenswrapper[4691]: E1202 08:04:56.509560 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7c46467-e60b-47fd-be7b-660d674b6504" containerName="neutron-db-sync" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.509578 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7c46467-e60b-47fd-be7b-660d674b6504" containerName="neutron-db-sync" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.520053 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7c46467-e60b-47fd-be7b-660d674b6504" containerName="neutron-db-sync" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.521580 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.523155 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-qrkdk"] Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.666909 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.667099 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.667131 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-config\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.667158 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjp99\" (UniqueName: \"kubernetes.io/projected/fb8ecb10-bbe6-4998-83cb-97441258d0c9-kube-api-access-qjp99\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.667203 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.667417 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-dns-svc\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.672732 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-54f65c9796-xhrrr"] Dec 02 08:04:56 crc kubenswrapper[4691]: W1202 08:04:56.673550 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a7091d4_3a8f_4c52_b3e8_b35913233371.slice/crio-fa065bfcb973e8808a2302cbfb4c688ec66ef43aa9372ae722364157f5a62ef7 WatchSource:0}: Error finding container fa065bfcb973e8808a2302cbfb4c688ec66ef43aa9372ae722364157f5a62ef7: Status 404 returned error can't find the container with id fa065bfcb973e8808a2302cbfb4c688ec66ef43aa9372ae722364157f5a62ef7 Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.677413 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.680139 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nkqg6" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.680401 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.680620 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.689710 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.693575 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54f65c9796-xhrrr"] Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.750225 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.769021 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.769087 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-ovndb-tls-certs\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.769105 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-combined-ca-bundle\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.769153 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lktp2\" (UniqueName: \"kubernetes.io/projected/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-kube-api-access-lktp2\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.769174 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-dns-svc\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.769339 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.769610 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-httpd-config\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.769639 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-config\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.769667 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.769686 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-config\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.769705 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjp99\" (UniqueName: \"kubernetes.io/projected/fb8ecb10-bbe6-4998-83cb-97441258d0c9-kube-api-access-qjp99\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.770174 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.770841 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.771151 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.771391 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-config\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.781933 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-dns-svc\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.792713 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjp99\" (UniqueName: \"kubernetes.io/projected/fb8ecb10-bbe6-4998-83cb-97441258d0c9-kube-api-access-qjp99\") pod \"dnsmasq-dns-55f844cf75-qrkdk\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:56 crc kubenswrapper[4691]: I1202 08:04:56.805058 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.051363 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-config\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.051530 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-ovndb-tls-certs\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.051554 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-combined-ca-bundle\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.051673 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lktp2\" (UniqueName: \"kubernetes.io/projected/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-kube-api-access-lktp2\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.051821 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-httpd-config\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.068890 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.086714 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-httpd-config\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.089263 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-config\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.091478 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-ovndb-tls-certs\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.093260 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-combined-ca-bundle\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.095698 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lktp2\" (UniqueName: \"kubernetes.io/projected/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-kube-api-access-lktp2\") pod \"neutron-54f65c9796-xhrrr\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.135317 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.178189 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-764c5664d7-29vqc" podUID="7fc34093-aa4d-4e35-992b-c071d31705d5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.124:5353: i/o timeout" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.403489 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6585c7db4b-jz894" event={"ID":"76022e0c-2dd2-4395-8607-aa13da42f557","Type":"ContainerStarted","Data":"e942f85fdcb6e48c185f4fca5e85867db7312e614ddc6739a3f2fd4bae84d024"} Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.403804 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6585c7db4b-jz894" event={"ID":"76022e0c-2dd2-4395-8607-aa13da42f557","Type":"ContainerStarted","Data":"0d35f9365001d7eb9b008c16b4c8bb52a31f1442176990a7a2f64deeaba1a6cd"} Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.456085 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758b4cf594-fpkds" event={"ID":"a95b5239-be71-4b06-88b2-52875915162e","Type":"ContainerStarted","Data":"fe343d6838838b4d379b0b348314566df9f6e06b542365de413698b6e144a093"} Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.471032 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0f883f9e-2ece-4a76-85c8-46cca73e0796","Type":"ContainerStarted","Data":"afae584a0dad11072297679c5ef29d923e1ff38328f7f242107907ce5bd2c291"} Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.484340 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-svfbx" event={"ID":"94701dd1-f34b-4bdc-bf74-f67799127cd5","Type":"ContainerStarted","Data":"2bcee9f6f940f70280315d05b9510f179ada01021af10d3e5e1665f8a838fe67"} Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.484385 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-svfbx" event={"ID":"94701dd1-f34b-4bdc-bf74-f67799127cd5","Type":"ContainerStarted","Data":"bacfa7dbbe55b6d395d216ecbbd5e979606a5ffd8212a8466766eb4555876d44"} Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.504986 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a7091d4-3a8f-4c52-b3e8-b35913233371","Type":"ContainerStarted","Data":"fa065bfcb973e8808a2302cbfb4c688ec66ef43aa9372ae722364157f5a62ef7"} Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.534538 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6585c7db4b-jz894" podStartSLOduration=23.534516824 podStartE2EDuration="23.534516824s" podCreationTimestamp="2025-12-02 08:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:04:57.533712735 +0000 UTC m=+1145.317791597" watchObservedRunningTime="2025-12-02 08:04:57.534516824 +0000 UTC m=+1145.318595686" Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.854312 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c4b8f5f8f-l7b75" event={"ID":"24910bce-2ac5-4966-af8e-48dad2b11370","Type":"ContainerStarted","Data":"f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74"} Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.854348 4691 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/horizon-7c4b8f5f8f-l7b75" event={"ID":"24910bce-2ac5-4966-af8e-48dad2b11370","Type":"ContainerStarted","Data":"4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667"} Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.854452 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c4b8f5f8f-l7b75" podUID="24910bce-2ac5-4966-af8e-48dad2b11370" containerName="horizon-log" containerID="cri-o://f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74" gracePeriod=30 Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.854542 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7c4b8f5f8f-l7b75" podUID="24910bce-2ac5-4966-af8e-48dad2b11370" containerName="horizon" containerID="cri-o://4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667" gracePeriod=30 Dec 02 08:04:57 crc kubenswrapper[4691]: I1202 08:04:57.900462 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-758b4cf594-fpkds" podStartSLOduration=22.816258166 podStartE2EDuration="24.90044596s" podCreationTimestamp="2025-12-02 08:04:33 +0000 UTC" firstStartedPulling="2025-12-02 08:04:53.679272894 +0000 UTC m=+1141.463351756" lastFinishedPulling="2025-12-02 08:04:55.763460688 +0000 UTC m=+1143.547539550" observedRunningTime="2025-12-02 08:04:57.896678037 +0000 UTC m=+1145.680756899" watchObservedRunningTime="2025-12-02 08:04:57.90044596 +0000 UTC m=+1145.684524822" Dec 02 08:04:58 crc kubenswrapper[4691]: I1202 08:04:58.001311 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="f14bc2d4-ce0c-440d-9e1d-15b0b8716562" containerName="galera" probeResult="failure" output="command timed out" Dec 02 08:04:58 crc kubenswrapper[4691]: I1202 08:04:58.054425 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-svfbx" podStartSLOduration=13.054400165 podStartE2EDuration="13.054400165s" podCreationTimestamp="2025-12-02 08:04:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:04:57.949536337 +0000 UTC m=+1145.733615199" watchObservedRunningTime="2025-12-02 08:04:58.054400165 +0000 UTC m=+1145.838479027" Dec 02 08:04:58 crc kubenswrapper[4691]: I1202 08:04:58.055207 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7c4b8f5f8f-l7b75" podStartSLOduration=14.139766189 podStartE2EDuration="35.055200684s" podCreationTimestamp="2025-12-02 08:04:23 +0000 UTC" firstStartedPulling="2025-12-02 08:04:34.535770717 +0000 UTC m=+1122.319849579" lastFinishedPulling="2025-12-02 08:04:55.451205212 +0000 UTC m=+1143.235284074" observedRunningTime="2025-12-02 08:04:58.046218804 +0000 UTC m=+1145.830297666" watchObservedRunningTime="2025-12-02 08:04:58.055200684 +0000 UTC m=+1145.839279546" Dec 02 08:04:58 crc kubenswrapper[4691]: I1202 08:04:58.396434 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-qrkdk"] Dec 02 08:04:58 crc kubenswrapper[4691]: I1202 08:04:58.826693 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-54f65c9796-xhrrr"] Dec 02 08:04:58 crc kubenswrapper[4691]: I1202 08:04:58.873182 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" 
event={"ID":"fb8ecb10-bbe6-4998-83cb-97441258d0c9","Type":"ContainerStarted","Data":"0ce5bf8864400b065cd968591033518daa777fa46a186c6b51011a8411cbe770"} Dec 02 08:04:58 crc kubenswrapper[4691]: I1202 08:04:58.875866 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54f65c9796-xhrrr" event={"ID":"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4","Type":"ContainerStarted","Data":"93639c790695e3a5bab26dbed6b0cb96b131177716dde75edf91576dc4d213ec"} Dec 02 08:04:59 crc kubenswrapper[4691]: I1202 08:04:59.941947 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0f883f9e-2ece-4a76-85c8-46cca73e0796","Type":"ContainerStarted","Data":"07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08"} Dec 02 08:04:59 crc kubenswrapper[4691]: I1202 08:04:59.948466 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a7091d4-3a8f-4c52-b3e8-b35913233371","Type":"ContainerStarted","Data":"d33831a399d765d53b63507e333fdb5487277f3621ed518e8fa5a7dc61efa222"} Dec 02 08:04:59 crc kubenswrapper[4691]: I1202 08:04:59.963320 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54f65c9796-xhrrr" event={"ID":"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4","Type":"ContainerStarted","Data":"71151114337b8625dba3b9392398eed09d96a2c6af0ac7b621b5a223d7aac372"} Dec 02 08:04:59 crc kubenswrapper[4691]: I1202 08:04:59.966164 4691 generic.go:334] "Generic (PLEG): container finished" podID="fb8ecb10-bbe6-4998-83cb-97441258d0c9" containerID="e83e3bd223ae0c3f5154acceccdb2269d67b02adb72ef1c750e5fa4b87b239a4" exitCode=0 Dec 02 08:04:59 crc kubenswrapper[4691]: I1202 08:04:59.966207 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" event={"ID":"fb8ecb10-bbe6-4998-83cb-97441258d0c9","Type":"ContainerDied","Data":"e83e3bd223ae0c3f5154acceccdb2269d67b02adb72ef1c750e5fa4b87b239a4"} Dec 02 08:05:00 crc kubenswrapper[4691]: I1202 08:05:00.985379 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0f883f9e-2ece-4a76-85c8-46cca73e0796","Type":"ContainerStarted","Data":"e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5"} Dec 02 08:05:00 crc kubenswrapper[4691]: I1202 08:05:00.991648 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a7091d4-3a8f-4c52-b3e8-b35913233371","Type":"ContainerStarted","Data":"b0b5d57b0a480c140403eaeb18a43bbd953094018058d8bc77a568ea99f8c0ea"} Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.001328 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54f65c9796-xhrrr" event={"ID":"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4","Type":"ContainerStarted","Data":"2120bbc5ef060ade1923bcfcc752251e72b4c912cc46b65a5f836d511bff345d"} Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.001803 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.011095 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" event={"ID":"fb8ecb10-bbe6-4998-83cb-97441258d0c9","Type":"ContainerStarted","Data":"d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6"} Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.041627 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/glance-default-external-api-0" podStartSLOduration=7.041608238 podStartE2EDuration="7.041608238s" podCreationTimestamp="2025-12-02 08:04:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:01.012362039 +0000 UTC m=+1148.796440901" watchObservedRunningTime="2025-12-02 08:05:01.041608238 +0000 UTC m=+1148.825687090" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.078813 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-54f65c9796-xhrrr" podStartSLOduration=5.078791282 podStartE2EDuration="5.078791282s" podCreationTimestamp="2025-12-02 08:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:01.068975531 +0000 UTC m=+1148.853054403" watchObservedRunningTime="2025-12-02 08:05:01.078791282 +0000 UTC m=+1148.862870144" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.351137 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-95567dd97-rcpxr"] Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.353178 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.359830 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.360147 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.383501 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-95567dd97-rcpxr"] Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.465398 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-internal-tls-certs\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.465454 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-ovndb-tls-certs\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.465493 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-public-tls-certs\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.465517 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-config\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.465546 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-4t9zh\" (UniqueName: \"kubernetes.io/projected/22255ebb-1831-4d2e-966b-1ae2fee83ebf-kube-api-access-4t9zh\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.465655 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-httpd-config\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.465672 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-combined-ca-bundle\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.567411 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-combined-ca-bundle\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.567463 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-httpd-config\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.567585 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-internal-tls-certs\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.567616 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-ovndb-tls-certs\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.567651 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-public-tls-certs\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.567678 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-config\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.567710 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t9zh\" (UniqueName: 
\"kubernetes.io/projected/22255ebb-1831-4d2e-966b-1ae2fee83ebf-kube-api-access-4t9zh\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.575501 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-combined-ca-bundle\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.576219 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-ovndb-tls-certs\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.576962 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-config\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.577268 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-public-tls-certs\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.583998 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-internal-tls-certs\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.588550 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/22255ebb-1831-4d2e-966b-1ae2fee83ebf-httpd-config\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.590842 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t9zh\" (UniqueName: \"kubernetes.io/projected/22255ebb-1831-4d2e-966b-1ae2fee83ebf-kube-api-access-4t9zh\") pod \"neutron-95567dd97-rcpxr\" (UID: \"22255ebb-1831-4d2e-966b-1ae2fee83ebf\") " pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:01 crc kubenswrapper[4691]: I1202 08:05:01.695808 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:02 crc kubenswrapper[4691]: I1202 08:05:02.073367 4691 generic.go:334] "Generic (PLEG): container finished" podID="570d8c0d-670c-4132-85e4-e13633c3bcc2" containerID="225c4639c39f31f374a90598a5cf3035e03cc249f5b00247468d64d85a944dd3" exitCode=0 Dec 02 08:05:02 crc kubenswrapper[4691]: I1202 08:05:02.075134 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2zwm7" event={"ID":"570d8c0d-670c-4132-85e4-e13633c3bcc2","Type":"ContainerDied","Data":"225c4639c39f31f374a90598a5cf3035e03cc249f5b00247468d64d85a944dd3"} Dec 02 08:05:02 crc kubenswrapper[4691]: I1202 08:05:02.075707 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:05:02 crc kubenswrapper[4691]: I1202 08:05:02.112187 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=8.112168645 podStartE2EDuration="8.112168645s" podCreationTimestamp="2025-12-02 08:04:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:02.093132067 +0000 UTC m=+1149.877210929" watchObservedRunningTime="2025-12-02 08:05:02.112168645 +0000 UTC m=+1149.896247497" Dec 02 08:05:02 crc kubenswrapper[4691]: I1202 08:05:02.123507 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" podStartSLOduration=6.123490274 podStartE2EDuration="6.123490274s" podCreationTimestamp="2025-12-02 08:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:02.113844796 +0000 UTC m=+1149.897923658" watchObservedRunningTime="2025-12-02 08:05:02.123490274 +0000 UTC m=+1149.907569136" Dec 02 08:05:02 crc kubenswrapper[4691]: I1202 08:05:02.644440 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-95567dd97-rcpxr"] Dec 02 08:05:02 crc kubenswrapper[4691]: W1202 08:05:02.649816 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22255ebb_1831_4d2e_966b_1ae2fee83ebf.slice/crio-b52ae3e4b1c213397f6f04e8b8d1a0416f6dead161a570d9f8b86588009e2679 WatchSource:0}: Error finding container b52ae3e4b1c213397f6f04e8b8d1a0416f6dead161a570d9f8b86588009e2679: Status 404 returned error can't find the container with id b52ae3e4b1c213397f6f04e8b8d1a0416f6dead161a570d9f8b86588009e2679 Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.093166 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-95567dd97-rcpxr" event={"ID":"22255ebb-1831-4d2e-966b-1ae2fee83ebf","Type":"ContainerStarted","Data":"9d84e8149916709e3f94f5c3e05674416a8b421254e1c0715a4120f334b36712"} Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.093551 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-95567dd97-rcpxr" event={"ID":"22255ebb-1831-4d2e-966b-1ae2fee83ebf","Type":"ContainerStarted","Data":"b52ae3e4b1c213397f6f04e8b8d1a0416f6dead161a570d9f8b86588009e2679"} Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.098740 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c","Type":"ContainerStarted","Data":"f9d31b20778f758e91f3f422397acd3b8b8c5f7098632a78bd1e715e0847037f"} Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.434956 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2zwm7" Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.508335 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56f22\" (UniqueName: \"kubernetes.io/projected/570d8c0d-670c-4132-85e4-e13633c3bcc2-kube-api-access-56f22\") pod \"570d8c0d-670c-4132-85e4-e13633c3bcc2\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.508375 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-scripts\") pod \"570d8c0d-670c-4132-85e4-e13633c3bcc2\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.508463 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-combined-ca-bundle\") pod \"570d8c0d-670c-4132-85e4-e13633c3bcc2\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.508494 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570d8c0d-670c-4132-85e4-e13633c3bcc2-logs\") pod \"570d8c0d-670c-4132-85e4-e13633c3bcc2\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.508523 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-config-data\") pod \"570d8c0d-670c-4132-85e4-e13633c3bcc2\" (UID: \"570d8c0d-670c-4132-85e4-e13633c3bcc2\") " Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.513307 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-scripts" (OuterVolumeSpecName: "scripts") pod "570d8c0d-670c-4132-85e4-e13633c3bcc2" (UID: "570d8c0d-670c-4132-85e4-e13633c3bcc2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.520082 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/570d8c0d-670c-4132-85e4-e13633c3bcc2-logs" (OuterVolumeSpecName: "logs") pod "570d8c0d-670c-4132-85e4-e13633c3bcc2" (UID: "570d8c0d-670c-4132-85e4-e13633c3bcc2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.534273 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/570d8c0d-670c-4132-85e4-e13633c3bcc2-kube-api-access-56f22" (OuterVolumeSpecName: "kube-api-access-56f22") pod "570d8c0d-670c-4132-85e4-e13633c3bcc2" (UID: "570d8c0d-670c-4132-85e4-e13633c3bcc2"). InnerVolumeSpecName "kube-api-access-56f22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.537371 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-config-data" (OuterVolumeSpecName: "config-data") pod "570d8c0d-670c-4132-85e4-e13633c3bcc2" (UID: "570d8c0d-670c-4132-85e4-e13633c3bcc2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.573559 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "570d8c0d-670c-4132-85e4-e13633c3bcc2" (UID: "570d8c0d-670c-4132-85e4-e13633c3bcc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.610651 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56f22\" (UniqueName: \"kubernetes.io/projected/570d8c0d-670c-4132-85e4-e13633c3bcc2-kube-api-access-56f22\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.610686 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.610696 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.610705 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/570d8c0d-670c-4132-85e4-e13633c3bcc2-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:03 crc kubenswrapper[4691]: I1202 08:05:03.610714 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/570d8c0d-670c-4132-85e4-e13633c3bcc2-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.114003 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-95567dd97-rcpxr" event={"ID":"22255ebb-1831-4d2e-966b-1ae2fee83ebf","Type":"ContainerStarted","Data":"3a40e9c374987a76a10f7a89080c4c1e85bd07e4f1a65b7a751b248b2e0ef9bf"} Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.114190 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.118309 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2zwm7" event={"ID":"570d8c0d-670c-4132-85e4-e13633c3bcc2","Type":"ContainerDied","Data":"93670a343c1d8fc9d43a193c025e3fa34792c131499b814e794c2cea0c425e52"} Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.118708 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93670a343c1d8fc9d43a193c025e3fa34792c131499b814e794c2cea0c425e52" Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.118345 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2zwm7"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.136033 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-95567dd97-rcpxr" podStartSLOduration=3.136008977 podStartE2EDuration="3.136008977s" podCreationTimestamp="2025-12-02 08:05:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:04.132781548 +0000 UTC m=+1151.916860410" watchObservedRunningTime="2025-12-02 08:05:04.136008977 +0000 UTC m=+1151.920087839"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.251204 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-99bc7c96-6nbmb"]
Dec 02 08:05:04 crc kubenswrapper[4691]: E1202 08:05:04.251607 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="570d8c0d-670c-4132-85e4-e13633c3bcc2" containerName="placement-db-sync"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.251623 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="570d8c0d-670c-4132-85e4-e13633c3bcc2" containerName="placement-db-sync"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.257102 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="570d8c0d-670c-4132-85e4-e13633c3bcc2" containerName="placement-db-sync"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.258367 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7c4b8f5f8f-l7b75"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.258473 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.265281 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.265527 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.265643 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9fpvx"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.267097 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.267561 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.288170 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-99bc7c96-6nbmb"]
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.325726 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-config-data\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.325797 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-public-tls-certs\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.325879 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-combined-ca-bundle\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.325911 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-scripts\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.325947 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zg9s8\" (UniqueName: \"kubernetes.io/projected/1fd0d9c6-1443-4198-8319-642b450eecb8-kube-api-access-zg9s8\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.325998 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-internal-tls-certs\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.326060 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fd0d9c6-1443-4198-8319-642b450eecb8-logs\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.413600 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-758b4cf594-fpkds"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.413675 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-758b4cf594-fpkds"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.427806 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-config-data\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.427848 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-public-tls-certs\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.427907 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-combined-ca-bundle\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.427932 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-scripts\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.427968 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zg9s8\" (UniqueName: \"kubernetes.io/projected/1fd0d9c6-1443-4198-8319-642b450eecb8-kube-api-access-zg9s8\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.428002 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-internal-tls-certs\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.428041 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fd0d9c6-1443-4198-8319-642b450eecb8-logs\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.428677 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1fd0d9c6-1443-4198-8319-642b450eecb8-logs\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.436358 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-combined-ca-bundle\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.443397 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-internal-tls-certs\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.449663 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-scripts\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.453305 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zg9s8\" (UniqueName: \"kubernetes.io/projected/1fd0d9c6-1443-4198-8319-642b450eecb8-kube-api-access-zg9s8\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.458309 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-public-tls-certs\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.468522 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1fd0d9c6-1443-4198-8319-642b450eecb8-config-data\") pod \"placement-99bc7c96-6nbmb\" (UID: \"1fd0d9c6-1443-4198-8319-642b450eecb8\") " pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.577335 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6585c7db4b-jz894"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.577376 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6585c7db4b-jz894"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.591064 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.848162 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.849283 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.849334 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.849345 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.909874 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.923078 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.937045 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 02 08:05:04 crc kubenswrapper[4691]: I1202 08:05:04.958288 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 02 08:05:05 crc kubenswrapper[4691]: I1202 08:05:05.020912 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-99bc7c96-6nbmb"]
Dec 02 08:05:05 crc kubenswrapper[4691]: I1202 08:05:05.136921 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-99bc7c96-6nbmb" event={"ID":"1fd0d9c6-1443-4198-8319-642b450eecb8","Type":"ContainerStarted","Data":"53070a1ec9f5b93c688d3ca16322a99e2393f00c12fec2cf8de17801c2b9969c"}
Dec 02 08:05:05 crc kubenswrapper[4691]: I1202 08:05:05.145039 4691 generic.go:334] "Generic (PLEG): container finished" podID="94701dd1-f34b-4bdc-bf74-f67799127cd5" containerID="2bcee9f6f940f70280315d05b9510f179ada01021af10d3e5e1665f8a838fe67" exitCode=0
Dec 02 08:05:05 crc kubenswrapper[4691]: I1202 08:05:05.145889 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-svfbx" event={"ID":"94701dd1-f34b-4bdc-bf74-f67799127cd5","Type":"ContainerDied","Data":"2bcee9f6f940f70280315d05b9510f179ada01021af10d3e5e1665f8a838fe67"}
Dec 02 08:05:05 crc kubenswrapper[4691]: I1202 08:05:05.146426 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 02 08:05:05 crc kubenswrapper[4691]: I1202 08:05:05.146455 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 02 08:05:05 crc kubenswrapper[4691]: I1202 08:05:05.146588 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 02 08:05:05 crc kubenswrapper[4691]: I1202 08:05:05.147736 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.161499 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-99bc7c96-6nbmb" event={"ID":"1fd0d9c6-1443-4198-8319-642b450eecb8","Type":"ContainerStarted","Data":"b2dc6c58b724c498f0fcf1f32e7b6ef80e66b8cf63a6896cc46e38078535e3dd"}
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.161835 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-99bc7c96-6nbmb" event={"ID":"1fd0d9c6-1443-4198-8319-642b450eecb8","Type":"ContainerStarted","Data":"db0fe9d5227d3fdd1bb3cc2ce2ab3cc8db5c3abfa014ed65b29e7d0fe781f974"}
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.163169 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.163231 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-99bc7c96-6nbmb"
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.202462 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-99bc7c96-6nbmb" podStartSLOduration=2.202440885 podStartE2EDuration="2.202440885s" podCreationTimestamp="2025-12-02 08:05:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:06.188276197 +0000 UTC m=+1153.972355079" watchObservedRunningTime="2025-12-02 08:05:06.202440885 +0000 UTC m=+1153.986519747"
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.714195 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-svfbx"
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.892461 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk78w\" (UniqueName: \"kubernetes.io/projected/94701dd1-f34b-4bdc-bf74-f67799127cd5-kube-api-access-tk78w\") pod \"94701dd1-f34b-4bdc-bf74-f67799127cd5\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") "
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.892892 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-config-data\") pod \"94701dd1-f34b-4bdc-bf74-f67799127cd5\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") "
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.893090 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-combined-ca-bundle\") pod \"94701dd1-f34b-4bdc-bf74-f67799127cd5\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") "
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.893112 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-fernet-keys\") pod \"94701dd1-f34b-4bdc-bf74-f67799127cd5\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") "
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.893158 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-credential-keys\") pod \"94701dd1-f34b-4bdc-bf74-f67799127cd5\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") "
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.893196 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-scripts\") pod \"94701dd1-f34b-4bdc-bf74-f67799127cd5\" (UID: \"94701dd1-f34b-4bdc-bf74-f67799127cd5\") "
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.899437 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-scripts" (OuterVolumeSpecName: "scripts") pod "94701dd1-f34b-4bdc-bf74-f67799127cd5" (UID: "94701dd1-f34b-4bdc-bf74-f67799127cd5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.901056 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "94701dd1-f34b-4bdc-bf74-f67799127cd5" (UID: "94701dd1-f34b-4bdc-bf74-f67799127cd5"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.902489 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94701dd1-f34b-4bdc-bf74-f67799127cd5-kube-api-access-tk78w" (OuterVolumeSpecName: "kube-api-access-tk78w") pod "94701dd1-f34b-4bdc-bf74-f67799127cd5" (UID: "94701dd1-f34b-4bdc-bf74-f67799127cd5"). InnerVolumeSpecName "kube-api-access-tk78w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.927019 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-config-data" (OuterVolumeSpecName: "config-data") pod "94701dd1-f34b-4bdc-bf74-f67799127cd5" (UID: "94701dd1-f34b-4bdc-bf74-f67799127cd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.934365 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "94701dd1-f34b-4bdc-bf74-f67799127cd5" (UID: "94701dd1-f34b-4bdc-bf74-f67799127cd5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.959250 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "94701dd1-f34b-4bdc-bf74-f67799127cd5" (UID: "94701dd1-f34b-4bdc-bf74-f67799127cd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.996060 4691 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-credential-keys\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.996097 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.996109 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk78w\" (UniqueName: \"kubernetes.io/projected/94701dd1-f34b-4bdc-bf74-f67799127cd5-kube-api-access-tk78w\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.996122 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.996131 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:06 crc kubenswrapper[4691]: I1202 08:05:06.996141 4691 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/94701dd1-f34b-4bdc-bf74-f67799127cd5-fernet-keys\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.072937 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.162013 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-p84rj"]
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.162252 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" podUID="8956bae5-b6fb-496d-95b1-775e634fb54b" containerName="dnsmasq-dns" containerID="cri-o://b8db8bfef34f507c9cfa0f765ff0612b8805a7355d5e13cec8e5afceba413d61" gracePeriod=10
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.197477 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.197503 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.197549 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-svfbx" event={"ID":"94701dd1-f34b-4bdc-bf74-f67799127cd5","Type":"ContainerDied","Data":"bacfa7dbbe55b6d395d216ecbbd5e979606a5ffd8212a8466766eb4555876d44"}
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.197605 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bacfa7dbbe55b6d395d216ecbbd5e979606a5ffd8212a8466766eb4555876d44"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.197661 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-svfbx"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.320400 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-55c7cfcf8b-8rs5b"]
Dec 02 08:05:07 crc kubenswrapper[4691]: E1202 08:05:07.326592 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94701dd1-f34b-4bdc-bf74-f67799127cd5" containerName="keystone-bootstrap"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.326628 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="94701dd1-f34b-4bdc-bf74-f67799127cd5" containerName="keystone-bootstrap"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.327034 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="94701dd1-f34b-4bdc-bf74-f67799127cd5" containerName="keystone-bootstrap"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.333630 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55c7cfcf8b-8rs5b"]
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.333793 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.336604 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.337020 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-lmmt4"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.337258 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.337412 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.337689 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.337963 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.410915 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-fernet-keys\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.411288 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-config-data\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.411328 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-scripts\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.411350 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-internal-tls-certs\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.411385 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-combined-ca-bundle\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.411539 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-credential-keys\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.411606 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-public-tls-certs\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.411965 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z44sr\" (UniqueName: \"kubernetes.io/projected/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-kube-api-access-z44sr\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.513368 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-public-tls-certs\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.514320 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z44sr\" (UniqueName: \"kubernetes.io/projected/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-kube-api-access-z44sr\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.514920 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-fernet-keys\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.515053 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-config-data\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.515163 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-scripts\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.515244 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-internal-tls-certs\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.515321 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-combined-ca-bundle\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.515424 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-credential-keys\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.523307 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-public-tls-certs\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.533076 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-credential-keys\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.537313 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-config-data\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.539094 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-scripts\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.539444 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-fernet-keys\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.540079 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-combined-ca-bundle\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.541281 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z44sr\" (UniqueName: \"kubernetes.io/projected/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-kube-api-access-z44sr\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.542373 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd-internal-tls-certs\") pod \"keystone-55c7cfcf8b-8rs5b\" (UID: \"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd\") " pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:07 crc kubenswrapper[4691]: I1202 08:05:07.670022 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:08 crc kubenswrapper[4691]: I1202 08:05:08.217209 4691 generic.go:334] "Generic (PLEG): container finished" podID="8956bae5-b6fb-496d-95b1-775e634fb54b" containerID="b8db8bfef34f507c9cfa0f765ff0612b8805a7355d5e13cec8e5afceba413d61" exitCode=0
Dec 02 08:05:08 crc kubenswrapper[4691]: I1202 08:05:08.217306 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" event={"ID":"8956bae5-b6fb-496d-95b1-775e634fb54b","Type":"ContainerDied","Data":"b8db8bfef34f507c9cfa0f765ff0612b8805a7355d5e13cec8e5afceba413d61"}
Dec 02 08:05:08 crc kubenswrapper[4691]: I1202 08:05:08.616750 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 02 08:05:08 crc kubenswrapper[4691]: I1202 08:05:08.617214 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 08:05:08 crc kubenswrapper[4691]: I1202 08:05:08.787719 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 02 08:05:08 crc kubenswrapper[4691]: I1202 08:05:08.787936 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 02 08:05:08 crc kubenswrapper[4691]: I1202 08:05:08.791316 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 02 08:05:09 crc kubenswrapper[4691]: I1202 08:05:09.386106 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.483854 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj"
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.654398 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-ovsdbserver-nb\") pod \"8956bae5-b6fb-496d-95b1-775e634fb54b\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") "
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.654703 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-dns-swift-storage-0\") pod \"8956bae5-b6fb-496d-95b1-775e634fb54b\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") "
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.654801 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-dns-svc\") pod \"8956bae5-b6fb-496d-95b1-775e634fb54b\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") "
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.655455 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl4vk\" (UniqueName: \"kubernetes.io/projected/8956bae5-b6fb-496d-95b1-775e634fb54b-kube-api-access-jl4vk\") pod \"8956bae5-b6fb-496d-95b1-775e634fb54b\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") "
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.655553 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-config\") pod \"8956bae5-b6fb-496d-95b1-775e634fb54b\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") "
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.655604 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-ovsdbserver-sb\") pod \"8956bae5-b6fb-496d-95b1-775e634fb54b\" (UID: \"8956bae5-b6fb-496d-95b1-775e634fb54b\") "
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.673057 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8956bae5-b6fb-496d-95b1-775e634fb54b-kube-api-access-jl4vk" (OuterVolumeSpecName: "kube-api-access-jl4vk") pod "8956bae5-b6fb-496d-95b1-775e634fb54b" (UID: "8956bae5-b6fb-496d-95b1-775e634fb54b"). InnerVolumeSpecName "kube-api-access-jl4vk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.732457 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-config" (OuterVolumeSpecName: "config") pod "8956bae5-b6fb-496d-95b1-775e634fb54b" (UID: "8956bae5-b6fb-496d-95b1-775e634fb54b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.741279 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8956bae5-b6fb-496d-95b1-775e634fb54b" (UID: "8956bae5-b6fb-496d-95b1-775e634fb54b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.758623 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jl4vk\" (UniqueName: \"kubernetes.io/projected/8956bae5-b6fb-496d-95b1-775e634fb54b-kube-api-access-jl4vk\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.759163 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-config\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.759180 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.777013 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8956bae5-b6fb-496d-95b1-775e634fb54b" (UID: "8956bae5-b6fb-496d-95b1-775e634fb54b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.778488 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8956bae5-b6fb-496d-95b1-775e634fb54b" (UID: "8956bae5-b6fb-496d-95b1-775e634fb54b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.793064 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8956bae5-b6fb-496d-95b1-775e634fb54b" (UID: "8956bae5-b6fb-496d-95b1-775e634fb54b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.860802 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.860836 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:13 crc kubenswrapper[4691]: I1202 08:05:13.860850 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8956bae5-b6fb-496d-95b1-775e634fb54b-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.002827 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-55c7cfcf8b-8rs5b"]
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.291335 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7sh8f" event={"ID":"7f939f3c-07b4-42b8-94d9-3dbd15c03287","Type":"ContainerStarted","Data":"50ab6b0f70d3b855a58ba47bf17de058c6cf69c41cb33c8e150774441b7cdded"}
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.296023 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55c7cfcf8b-8rs5b" event={"ID":"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd","Type":"ContainerStarted","Data":"7e0c132d0c5442756d6adc79025441d0c8bc75f33c71cf78543a28c30eacbd43"}
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.296078 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-55c7cfcf8b-8rs5b" event={"ID":"f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd","Type":"ContainerStarted","Data":"551088a0261fcb9db1003624a689c75fbdfabb94adfabe9db433c5ccb980a1f0"}
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.296899 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-55c7cfcf8b-8rs5b"
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.318468 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-7sh8f" podStartSLOduration=3.743763961 podStartE2EDuration="54.318449278s" podCreationTimestamp="2025-12-02 08:04:20 +0000 UTC" firstStartedPulling="2025-12-02 08:04:22.964433953 +0000 UTC m=+1110.748512815" lastFinishedPulling="2025-12-02 08:05:13.53911927 +0000 UTC m=+1161.323198132" observedRunningTime="2025-12-02 08:05:14.311170839 +0000 UTC m=+1162.095249711" watchObservedRunningTime="2025-12-02 08:05:14.318449278 +0000 UTC m=+1162.102528140"
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.319403 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c","Type":"ContainerStarted","Data":"115d58de60e7de31dab6797bc7762256d680a06f202da1d69c04f40630ab01e2"}
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.337120 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-55c7cfcf8b-8rs5b" podStartSLOduration=7.337096647 podStartE2EDuration="7.337096647s" podCreationTimestamp="2025-12-02 08:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:14.332199896 +0000 UTC m=+1162.116278768" watchObservedRunningTime="2025-12-02 08:05:14.337096647 +0000 UTC m=+1162.121175509"
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.348343 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" event={"ID":"8956bae5-b6fb-496d-95b1-775e634fb54b","Type":"ContainerDied","Data":"a9b1cb461f06a2a84c670fe8f966cca2aea12a99cd3301f2ef0ca2163ef7fb25"}
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.348390 4691 scope.go:117] "RemoveContainer" containerID="b8db8bfef34f507c9cfa0f765ff0612b8805a7355d5e13cec8e5afceba413d61"
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.348528 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj"
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.417239 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-758b4cf594-fpkds" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused"
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.477854 4691 scope.go:117] "RemoveContainer" containerID="5d22a6811808d10f8dba30dc4e421baaa132869cc560106aa1dc2f9f517f8815"
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.494777 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-p84rj"]
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.506296 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-p84rj"]
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.565636 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6585c7db4b-jz894" podUID="76022e0c-2dd2-4395-8607-aa13da42f557" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused"
Dec 02 08:05:14 crc kubenswrapper[4691]: I1202 08:05:14.576058 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8956bae5-b6fb-496d-95b1-775e634fb54b" path="/var/lib/kubelet/pods/8956bae5-b6fb-496d-95b1-775e634fb54b/volumes"
Dec 02 08:05:15 crc kubenswrapper[4691]: I1202 08:05:15.363330 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6qz2r" event={"ID":"a217e1fe-be30-4247-91f3-020aaa089689","Type":"ContainerStarted","Data":"3463b4007cb4ad8471129346ff206ae38b9fa2426f91d358ddde396ea6ce2149"}
Dec 02 08:05:15 crc kubenswrapper[4691]: I1202 08:05:15.395719 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-6qz2r" podStartSLOduration=4.830682279 podStartE2EDuration="55.395697379s" podCreationTimestamp="2025-12-02 08:04:20 +0000 UTC" firstStartedPulling="2025-12-02 08:04:22.986375042 +0000 UTC m=+1110.770453904" lastFinishedPulling="2025-12-02 08:05:13.551390132 +0000 UTC m=+1161.335469004" observedRunningTime="2025-12-02 08:05:15.382412552 +0000 UTC m=+1163.166491424" watchObservedRunningTime="2025-12-02 08:05:15.395697379 +0000 UTC m=+1163.179776241"
Dec 02 08:05:17 crc kubenswrapper[4691]: I1202 08:05:17.058893 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-p84rj" podUID="8956bae5-b6fb-496d-95b1-775e634fb54b" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.142:5353: i/o timeout"
Dec 02 08:05:17 crc kubenswrapper[4691]: I1202 08:05:17.380926 4691 generic.go:334] "Generic (PLEG): container finished" podID="7f939f3c-07b4-42b8-94d9-3dbd15c03287" containerID="50ab6b0f70d3b855a58ba47bf17de058c6cf69c41cb33c8e150774441b7cdded" exitCode=0
Dec 02 08:05:17 crc kubenswrapper[4691]: I1202 08:05:17.380970 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7sh8f" event={"ID":"7f939f3c-07b4-42b8-94d9-3dbd15c03287","Type":"ContainerDied","Data":"50ab6b0f70d3b855a58ba47bf17de058c6cf69c41cb33c8e150774441b7cdded"}
Dec 02 08:05:19 crc kubenswrapper[4691]: I1202 08:05:19.403561 4691 generic.go:334] "Generic (PLEG): container finished" podID="a217e1fe-be30-4247-91f3-020aaa089689" containerID="3463b4007cb4ad8471129346ff206ae38b9fa2426f91d358ddde396ea6ce2149" exitCode=0
Dec 02 08:05:19 crc kubenswrapper[4691]: I1202 08:05:19.403642 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6qz2r" event={"ID":"a217e1fe-be30-4247-91f3-020aaa089689","Type":"ContainerDied","Data":"3463b4007cb4ad8471129346ff206ae38b9fa2426f91d358ddde396ea6ce2149"}
Dec 02 08:05:20 crc kubenswrapper[4691]: I1202 08:05:20.928186 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7sh8f"
Dec 02 08:05:20 crc kubenswrapper[4691]: I1202 08:05:20.938148 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6qz2r"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.003130 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f939f3c-07b4-42b8-94d9-3dbd15c03287-db-sync-config-data\") pod \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\" (UID: \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\") "
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.003225 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f939f3c-07b4-42b8-94d9-3dbd15c03287-combined-ca-bundle\") pod \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\" (UID: \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\") "
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.003274 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-combined-ca-bundle\") pod \"a217e1fe-be30-4247-91f3-020aaa089689\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") "
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.003328 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s5hb\" (UniqueName: \"kubernetes.io/projected/a217e1fe-be30-4247-91f3-020aaa089689-kube-api-access-6s5hb\") pod \"a217e1fe-be30-4247-91f3-020aaa089689\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") "
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.003398 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-scripts\") pod \"a217e1fe-be30-4247-91f3-020aaa089689\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") "
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.003426 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-db-sync-config-data\") pod \"a217e1fe-be30-4247-91f3-020aaa089689\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") "
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.003449 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-config-data\") pod \"a217e1fe-be30-4247-91f3-020aaa089689\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") "
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.003495 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a217e1fe-be30-4247-91f3-020aaa089689-etc-machine-id\") pod \"a217e1fe-be30-4247-91f3-020aaa089689\" (UID: \"a217e1fe-be30-4247-91f3-020aaa089689\") "
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.003572 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6xrr\" (UniqueName: \"kubernetes.io/projected/7f939f3c-07b4-42b8-94d9-3dbd15c03287-kube-api-access-b6xrr\") pod \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\" (UID: \"7f939f3c-07b4-42b8-94d9-3dbd15c03287\") "
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.004934 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a217e1fe-be30-4247-91f3-020aaa089689-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "a217e1fe-be30-4247-91f3-020aaa089689" (UID: "a217e1fe-be30-4247-91f3-020aaa089689"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.009392 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f939f3c-07b4-42b8-94d9-3dbd15c03287-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7f939f3c-07b4-42b8-94d9-3dbd15c03287" (UID: "7f939f3c-07b4-42b8-94d9-3dbd15c03287"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.010267 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a217e1fe-be30-4247-91f3-020aaa089689" (UID: "a217e1fe-be30-4247-91f3-020aaa089689"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.011724 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a217e1fe-be30-4247-91f3-020aaa089689-kube-api-access-6s5hb" (OuterVolumeSpecName: "kube-api-access-6s5hb") pod "a217e1fe-be30-4247-91f3-020aaa089689" (UID: "a217e1fe-be30-4247-91f3-020aaa089689"). InnerVolumeSpecName "kube-api-access-6s5hb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.011796 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-scripts" (OuterVolumeSpecName: "scripts") pod "a217e1fe-be30-4247-91f3-020aaa089689" (UID: "a217e1fe-be30-4247-91f3-020aaa089689"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.029041 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f939f3c-07b4-42b8-94d9-3dbd15c03287-kube-api-access-b6xrr" (OuterVolumeSpecName: "kube-api-access-b6xrr") pod "7f939f3c-07b4-42b8-94d9-3dbd15c03287" (UID: "7f939f3c-07b4-42b8-94d9-3dbd15c03287"). InnerVolumeSpecName "kube-api-access-b6xrr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.095949 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a217e1fe-be30-4247-91f3-020aaa089689" (UID: "a217e1fe-be30-4247-91f3-020aaa089689"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.095949 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f939f3c-07b4-42b8-94d9-3dbd15c03287-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f939f3c-07b4-42b8-94d9-3dbd15c03287" (UID: "7f939f3c-07b4-42b8-94d9-3dbd15c03287"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.110473 4691 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.110533 4691 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a217e1fe-be30-4247-91f3-020aaa089689-etc-machine-id\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.110584 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6xrr\" (UniqueName: \"kubernetes.io/projected/7f939f3c-07b4-42b8-94d9-3dbd15c03287-kube-api-access-b6xrr\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.110598 4691 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7f939f3c-07b4-42b8-94d9-3dbd15c03287-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.110608 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f939f3c-07b4-42b8-94d9-3dbd15c03287-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.110618 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.110627 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s5hb\" (UniqueName: \"kubernetes.io/projected/a217e1fe-be30-4247-91f3-020aaa089689-kube-api-access-6s5hb\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.110637 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-scripts\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.141564 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-config-data" (OuterVolumeSpecName: "config-data") pod "a217e1fe-be30-4247-91f3-020aaa089689" (UID: "a217e1fe-be30-4247-91f3-020aaa089689"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.215532 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a217e1fe-be30-4247-91f3-020aaa089689-config-data\") on node \"crc\" DevicePath \"\""
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.425839 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-7sh8f" event={"ID":"7f939f3c-07b4-42b8-94d9-3dbd15c03287","Type":"ContainerDied","Data":"0fc87a782fb450edea5abb46c422787bbcf66518b0ba5a31e6ac0da512349fb5"}
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.425884 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fc87a782fb450edea5abb46c422787bbcf66518b0ba5a31e6ac0da512349fb5"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.425855 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-7sh8f"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.427843 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-6qz2r" event={"ID":"a217e1fe-be30-4247-91f3-020aaa089689","Type":"ContainerDied","Data":"9a40aacf676cc0605aa54c512cbfa611437da37fba3978e49712ae02b734e29c"}
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.427886 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a40aacf676cc0605aa54c512cbfa611437da37fba3978e49712ae02b734e29c"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.427908 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-6qz2r"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.702400 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 02 08:05:21 crc kubenswrapper[4691]: E1202 08:05:21.703133 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f939f3c-07b4-42b8-94d9-3dbd15c03287" containerName="barbican-db-sync"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.703146 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f939f3c-07b4-42b8-94d9-3dbd15c03287" containerName="barbican-db-sync"
Dec 02 08:05:21 crc kubenswrapper[4691]: E1202 08:05:21.703166 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a217e1fe-be30-4247-91f3-020aaa089689" containerName="cinder-db-sync"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.703198 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a217e1fe-be30-4247-91f3-020aaa089689" containerName="cinder-db-sync"
Dec 02 08:05:21 crc kubenswrapper[4691]: E1202 08:05:21.703218 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8956bae5-b6fb-496d-95b1-775e634fb54b" containerName="dnsmasq-dns"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.703224 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8956bae5-b6fb-496d-95b1-775e634fb54b" containerName="dnsmasq-dns"
Dec 02 08:05:21 crc kubenswrapper[4691]: E1202 08:05:21.703243 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8956bae5-b6fb-496d-95b1-775e634fb54b" containerName="init"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.703249 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8956bae5-b6fb-496d-95b1-775e634fb54b" containerName="init"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.703426 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a217e1fe-be30-4247-91f3-020aaa089689" containerName="cinder-db-sync"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.703440 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8956bae5-b6fb-496d-95b1-775e634fb54b" containerName="dnsmasq-dns"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.703457 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f939f3c-07b4-42b8-94d9-3dbd15c03287" containerName="barbican-db-sync"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.704680 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.708446 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.708647 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.710082 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fxlkj"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.716336 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.722352 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.787445 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b895b5785-t65tl"]
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.794074 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-t65tl"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.814295 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-t65tl"]
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.827710 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjmgj\" (UniqueName: \"kubernetes.io/projected/e29c75f3-7434-4bf1-a23e-818f8d998754-kube-api-access-hjmgj\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.827824 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-scripts\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.827878 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-config-data\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0"
Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.827907 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0"
Dec 02 08:05:21 crc
kubenswrapper[4691]: I1202 08:05:21.827934 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e29c75f3-7434-4bf1-a23e-818f8d998754-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.827994 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.929438 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e29c75f3-7434-4bf1-a23e-818f8d998754-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.929562 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-dns-svc\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.929593 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.929648 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjmgj\" (UniqueName: \"kubernetes.io/projected/e29c75f3-7434-4bf1-a23e-818f8d998754-kube-api-access-hjmgj\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.929679 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.929712 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.929749 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwwqw\" (UniqueName: \"kubernetes.io/projected/952f1681-e30a-41fc-9d4e-17d2685e1776-kube-api-access-nwwqw\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:21 crc 
kubenswrapper[4691]: I1202 08:05:21.929827 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-scripts\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.929863 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.929918 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-config-data\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.929949 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-config\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.929988 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.930700 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e29c75f3-7434-4bf1-a23e-818f8d998754-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.947969 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-config-data\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.949782 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-scripts\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.960278 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjmgj\" (UniqueName: \"kubernetes.io/projected/e29c75f3-7434-4bf1-a23e-818f8d998754-kube-api-access-hjmgj\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.970659 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-combined-ca-bundle\") pod 
\"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:21 crc kubenswrapper[4691]: I1202 08:05:21.976550 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.030840 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.032581 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-dns-svc\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.032656 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.032687 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.032712 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwwqw\" (UniqueName: \"kubernetes.io/projected/952f1681-e30a-41fc-9d4e-17d2685e1776-kube-api-access-nwwqw\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.032746 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.032800 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-config\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.033827 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-config\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.033828 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-dns-svc\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.034340 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-ovsdbserver-sb\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.034507 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-ovsdbserver-nb\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.036515 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-dns-swift-storage-0\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.045925 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.048394 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.055359 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.059549 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.071721 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwwqw\" (UniqueName: \"kubernetes.io/projected/952f1681-e30a-41fc-9d4e-17d2685e1776-kube-api-access-nwwqw\") pod \"dnsmasq-dns-b895b5785-t65tl\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.121266 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.136310 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.136448 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-scripts\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.136484 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xxdz\" (UniqueName: \"kubernetes.io/projected/41935cdd-b285-4d02-b181-9675b1570b83-kube-api-access-9xxdz\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.136513 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-config-data-custom\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.136617 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-config-data\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.136670 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41935cdd-b285-4d02-b181-9675b1570b83-logs\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.136722 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41935cdd-b285-4d02-b181-9675b1570b83-etc-machine-id\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.238180 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.238578 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-scripts\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.238618 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xxdz\" 
(UniqueName: \"kubernetes.io/projected/41935cdd-b285-4d02-b181-9675b1570b83-kube-api-access-9xxdz\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.238647 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-config-data-custom\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.238700 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-config-data\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.238773 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41935cdd-b285-4d02-b181-9675b1570b83-logs\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.238828 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41935cdd-b285-4d02-b181-9675b1570b83-etc-machine-id\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.238980 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41935cdd-b285-4d02-b181-9675b1570b83-etc-machine-id\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.249219 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41935cdd-b285-4d02-b181-9675b1570b83-logs\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.254496 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.258088 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-scripts\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.258608 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-config-data-custom\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.271216 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-config-data\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.302132 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xxdz\" (UniqueName: \"kubernetes.io/projected/41935cdd-b285-4d02-b181-9675b1570b83-kube-api-access-9xxdz\") pod \"cinder-api-0\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.303412 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-57597556c5-xzp56"] Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.312848 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.325702 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.325817 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-skzl9" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.326435 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.338942 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-57597556c5-xzp56"] Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.356685 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-598b69485d-n2fl7"] Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.358398 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.364688 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.382560 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-598b69485d-n2fl7"] Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.429827 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.444341 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-config-data\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.444380 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c13685c-ff97-4074-8bc8-5659d16ec95d-logs\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.444417 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxc8q\" (UniqueName: \"kubernetes.io/projected/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-kube-api-access-kxc8q\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.444448 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c13685c-ff97-4074-8bc8-5659d16ec95d-combined-ca-bundle\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.444473 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ttgg\" (UniqueName: \"kubernetes.io/projected/3c13685c-ff97-4074-8bc8-5659d16ec95d-kube-api-access-9ttgg\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.444495 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c13685c-ff97-4074-8bc8-5659d16ec95d-config-data-custom\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.444520 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-config-data-custom\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.444540 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-logs\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.444566 4691 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c13685c-ff97-4074-8bc8-5659d16ec95d-config-data\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.444675 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-combined-ca-bundle\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.484631 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-t65tl"] Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.505863 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-v65fc"] Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.509274 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.528986 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-v65fc"] Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.547831 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.547906 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v6l7\" (UniqueName: \"kubernetes.io/projected/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-kube-api-access-6v6l7\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.547969 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.548010 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-config\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.548049 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-config-data\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 
08:05:22.548078 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c13685c-ff97-4074-8bc8-5659d16ec95d-logs\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.548114 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kxc8q\" (UniqueName: \"kubernetes.io/projected/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-kube-api-access-kxc8q\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.548152 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c13685c-ff97-4074-8bc8-5659d16ec95d-combined-ca-bundle\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.548180 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ttgg\" (UniqueName: \"kubernetes.io/projected/3c13685c-ff97-4074-8bc8-5659d16ec95d-kube-api-access-9ttgg\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.548205 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c13685c-ff97-4074-8bc8-5659d16ec95d-config-data-custom\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.548226 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.548261 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-config-data-custom\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.548297 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-logs\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.548333 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c13685c-ff97-4074-8bc8-5659d16ec95d-config-data\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " 
pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.548373 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-combined-ca-bundle\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.548394 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.552579 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-logs\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.558585 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c13685c-ff97-4074-8bc8-5659d16ec95d-logs\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.565452 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-config-data\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.567446 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c13685c-ff97-4074-8bc8-5659d16ec95d-combined-ca-bundle\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.585999 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-combined-ca-bundle\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.709239 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c13685c-ff97-4074-8bc8-5659d16ec95d-config-data\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.725194 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3c13685c-ff97-4074-8bc8-5659d16ec95d-config-data-custom\") pod \"barbican-worker-57597556c5-xzp56\" (UID: 
\"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.726725 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-config-data-custom\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.735436 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.735707 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.735811 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.735836 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v6l7\" (UniqueName: \"kubernetes.io/projected/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-kube-api-access-6v6l7\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.735964 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.736083 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-config\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.737492 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxc8q\" (UniqueName: \"kubernetes.io/projected/d855ba5e-92ab-4a5e-b613-f49c9fec44b1-kube-api-access-kxc8q\") pod \"barbican-keystone-listener-598b69485d-n2fl7\" (UID: \"d855ba5e-92ab-4a5e-b613-f49c9fec44b1\") " pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.739530 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: 
\"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.742169 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.744673 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ttgg\" (UniqueName: \"kubernetes.io/projected/3c13685c-ff97-4074-8bc8-5659d16ec95d-kube-api-access-9ttgg\") pod \"barbican-worker-57597556c5-xzp56\" (UID: \"3c13685c-ff97-4074-8bc8-5659d16ec95d\") " pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.747273 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.747932 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.750010 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-config\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.767946 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.770913 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v6l7\" (UniqueName: \"kubernetes.io/projected/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-kube-api-access-6v6l7\") pod \"dnsmasq-dns-5c9776ccc5-v65fc\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.786012 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56b46d4bbb-w8vmj"] Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.787990 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56b46d4bbb-w8vmj"] Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.788104 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.795224 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.852506 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.949298 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-logs\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.949413 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-combined-ca-bundle\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.949448 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-config-data-custom\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.949598 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xn6v\" (UniqueName: \"kubernetes.io/projected/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-kube-api-access-9xn6v\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.949654 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-config-data\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:22 crc kubenswrapper[4691]: I1202 08:05:22.973852 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-57597556c5-xzp56" Dec 02 08:05:23 crc kubenswrapper[4691]: I1202 08:05:23.051738 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-logs\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:23 crc kubenswrapper[4691]: I1202 08:05:23.052149 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-combined-ca-bundle\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:23 crc kubenswrapper[4691]: I1202 08:05:23.052181 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-config-data-custom\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:23 crc kubenswrapper[4691]: I1202 08:05:23.052275 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-logs\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:23 crc kubenswrapper[4691]: I1202 08:05:23.052285 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xn6v\" (UniqueName: \"kubernetes.io/projected/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-kube-api-access-9xn6v\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:23 crc kubenswrapper[4691]: I1202 08:05:23.052423 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-config-data\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:23 crc kubenswrapper[4691]: I1202 08:05:23.057561 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-config-data-custom\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:23 crc kubenswrapper[4691]: I1202 08:05:23.058362 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-config-data\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:23 crc kubenswrapper[4691]: I1202 08:05:23.075439 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-combined-ca-bundle\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:23 
crc kubenswrapper[4691]: I1202 08:05:23.077869 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xn6v\" (UniqueName: \"kubernetes.io/projected/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-kube-api-access-9xn6v\") pod \"barbican-api-56b46d4bbb-w8vmj\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:23 crc kubenswrapper[4691]: I1202 08:05:23.127236 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:24 crc kubenswrapper[4691]: I1202 08:05:24.233910 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:05:24 crc kubenswrapper[4691]: I1202 08:05:24.415164 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-758b4cf594-fpkds" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 02 08:05:24 crc kubenswrapper[4691]: I1202 08:05:24.563912 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6585c7db4b-jz894" podUID="76022e0c-2dd2-4395-8607-aa13da42f557" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Dec 02 08:05:24 crc kubenswrapper[4691]: I1202 08:05:24.667964 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:05:24 crc kubenswrapper[4691]: I1202 08:05:24.678326 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-v65fc"] Dec 02 08:05:24 crc kubenswrapper[4691]: I1202 08:05:24.873041 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-598b69485d-n2fl7"] Dec 02 08:05:24 crc kubenswrapper[4691]: I1202 08:05:24.894534 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-t65tl"] Dec 02 08:05:24 crc kubenswrapper[4691]: I1202 08:05:24.912318 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-57597556c5-xzp56"] Dec 02 08:05:24 crc kubenswrapper[4691]: I1202 08:05:24.970101 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.098179 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56b46d4bbb-w8vmj"] Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.530990 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57597556c5-xzp56" event={"ID":"3c13685c-ff97-4074-8bc8-5659d16ec95d","Type":"ContainerStarted","Data":"73cf5a385d0c09c9deb74e3492bcae5ad13c9ae311128c094fe5c2337a2e1a62"} Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.562187 4691 generic.go:334] "Generic (PLEG): container finished" podID="a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" containerID="995361b04d06c767d40cd78fcd5e258a55f7f5f1ecba12fb8cb1e797bc072476" exitCode=0 Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.562255 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" event={"ID":"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3","Type":"ContainerDied","Data":"995361b04d06c767d40cd78fcd5e258a55f7f5f1ecba12fb8cb1e797bc072476"} Dec 02 08:05:25 crc 
kubenswrapper[4691]: I1202 08:05:25.562276 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" event={"ID":"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3","Type":"ContainerStarted","Data":"f9c7d80f8071a1a9b476478d0ba7bd73928db2f4c9ffe6f55a82918e3334b507"} Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.571225 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"41935cdd-b285-4d02-b181-9675b1570b83","Type":"ContainerStarted","Data":"016f291c6334b761af4a0205c006d606f6407c5093ae10312d24088c59697d8d"} Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.590337 4691 generic.go:334] "Generic (PLEG): container finished" podID="952f1681-e30a-41fc-9d4e-17d2685e1776" containerID="49db2e8d84ee9a973a78f44625eeaa23c76e3c240190e07fb74bf34e72842881" exitCode=0 Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.590443 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-t65tl" event={"ID":"952f1681-e30a-41fc-9d4e-17d2685e1776","Type":"ContainerDied","Data":"49db2e8d84ee9a973a78f44625eeaa23c76e3c240190e07fb74bf34e72842881"} Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.590478 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-t65tl" event={"ID":"952f1681-e30a-41fc-9d4e-17d2685e1776","Type":"ContainerStarted","Data":"84113b9f9614a9c780e7eb9fe625688a71fc41e2da7438e95d9db914c39cb1af"} Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.600170 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c","Type":"ContainerStarted","Data":"7d2275e4170bb9ead85515498234bdace94b051a083ba93fafd25980fadfcf7e"} Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.600392 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="ceilometer-central-agent" containerID="cri-o://e69205269ae58f6dcbb3393c3e2d577675c6f416145c65fe9bb77fe4cfbb3d48" gracePeriod=30 Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.600811 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.601376 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="sg-core" containerID="cri-o://115d58de60e7de31dab6797bc7762256d680a06f202da1d69c04f40630ab01e2" gracePeriod=30 Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.601483 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="proxy-httpd" containerID="cri-o://7d2275e4170bb9ead85515498234bdace94b051a083ba93fafd25980fadfcf7e" gracePeriod=30 Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.601534 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="ceilometer-notification-agent" containerID="cri-o://f9d31b20778f758e91f3f422397acd3b8b8c5f7098632a78bd1e715e0847037f" gracePeriod=30 Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.605658 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b46d4bbb-w8vmj" 
event={"ID":"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef","Type":"ContainerStarted","Data":"c6495d7a8e820854aaed8fdebfe3bef2ad068886c18ffc5ab9fec81e65cdc7f9"} Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.605693 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b46d4bbb-w8vmj" event={"ID":"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef","Type":"ContainerStarted","Data":"66fd4864b68e7c0bf7e09c2b400e350fcde7956ffa7c102049d4dab1a9720d3b"} Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.607110 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e29c75f3-7434-4bf1-a23e-818f8d998754","Type":"ContainerStarted","Data":"24d2215bbf01aaed09e3a03b70b6a30dc8889f5db91d9be7d0460d248bec2b1f"} Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.610472 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" event={"ID":"d855ba5e-92ab-4a5e-b613-f49c9fec44b1","Type":"ContainerStarted","Data":"e8112b96194b8b05e5cac131f3b7fe41449913591c1164e19cb410e4095f8042"} Dec 02 08:05:25 crc kubenswrapper[4691]: I1202 08:05:25.746629 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.007213919 podStartE2EDuration="1m4.746602401s" podCreationTimestamp="2025-12-02 08:04:21 +0000 UTC" firstStartedPulling="2025-12-02 08:04:23.163711351 +0000 UTC m=+1110.947790213" lastFinishedPulling="2025-12-02 08:05:23.903099833 +0000 UTC m=+1171.687178695" observedRunningTime="2025-12-02 08:05:25.717905696 +0000 UTC m=+1173.501984568" watchObservedRunningTime="2025-12-02 08:05:25.746602401 +0000 UTC m=+1173.530681263" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.291290 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.312629 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-config\") pod \"952f1681-e30a-41fc-9d4e-17d2685e1776\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.312750 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-ovsdbserver-nb\") pod \"952f1681-e30a-41fc-9d4e-17d2685e1776\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.312835 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-ovsdbserver-sb\") pod \"952f1681-e30a-41fc-9d4e-17d2685e1776\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.313123 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-dns-svc\") pod \"952f1681-e30a-41fc-9d4e-17d2685e1776\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.313161 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwwqw\" (UniqueName: \"kubernetes.io/projected/952f1681-e30a-41fc-9d4e-17d2685e1776-kube-api-access-nwwqw\") pod \"952f1681-e30a-41fc-9d4e-17d2685e1776\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.313186 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-dns-swift-storage-0\") pod \"952f1681-e30a-41fc-9d4e-17d2685e1776\" (UID: \"952f1681-e30a-41fc-9d4e-17d2685e1776\") " Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.349404 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/952f1681-e30a-41fc-9d4e-17d2685e1776-kube-api-access-nwwqw" (OuterVolumeSpecName: "kube-api-access-nwwqw") pod "952f1681-e30a-41fc-9d4e-17d2685e1776" (UID: "952f1681-e30a-41fc-9d4e-17d2685e1776"). InnerVolumeSpecName "kube-api-access-nwwqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.396560 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "952f1681-e30a-41fc-9d4e-17d2685e1776" (UID: "952f1681-e30a-41fc-9d4e-17d2685e1776"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.399535 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-config" (OuterVolumeSpecName: "config") pod "952f1681-e30a-41fc-9d4e-17d2685e1776" (UID: "952f1681-e30a-41fc-9d4e-17d2685e1776"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.401458 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "952f1681-e30a-41fc-9d4e-17d2685e1776" (UID: "952f1681-e30a-41fc-9d4e-17d2685e1776"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.417278 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.417337 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwwqw\" (UniqueName: \"kubernetes.io/projected/952f1681-e30a-41fc-9d4e-17d2685e1776-kube-api-access-nwwqw\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.417351 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.417362 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.421377 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "952f1681-e30a-41fc-9d4e-17d2685e1776" (UID: "952f1681-e30a-41fc-9d4e-17d2685e1776"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.424588 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "952f1681-e30a-41fc-9d4e-17d2685e1776" (UID: "952f1681-e30a-41fc-9d4e-17d2685e1776"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.519222 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.519257 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/952f1681-e30a-41fc-9d4e-17d2685e1776-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.631967 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b895b5785-t65tl" event={"ID":"952f1681-e30a-41fc-9d4e-17d2685e1776","Type":"ContainerDied","Data":"84113b9f9614a9c780e7eb9fe625688a71fc41e2da7438e95d9db914c39cb1af"} Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.632026 4691 scope.go:117] "RemoveContainer" containerID="49db2e8d84ee9a973a78f44625eeaa23c76e3c240190e07fb74bf34e72842881" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.632234 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b895b5785-t65tl" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.643140 4691 generic.go:334] "Generic (PLEG): container finished" podID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerID="7d2275e4170bb9ead85515498234bdace94b051a083ba93fafd25980fadfcf7e" exitCode=0 Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.643174 4691 generic.go:334] "Generic (PLEG): container finished" podID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerID="115d58de60e7de31dab6797bc7762256d680a06f202da1d69c04f40630ab01e2" exitCode=2 Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.643185 4691 generic.go:334] "Generic (PLEG): container finished" podID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerID="e69205269ae58f6dcbb3393c3e2d577675c6f416145c65fe9bb77fe4cfbb3d48" exitCode=0 Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.643206 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c","Type":"ContainerDied","Data":"7d2275e4170bb9ead85515498234bdace94b051a083ba93fafd25980fadfcf7e"} Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.643306 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c","Type":"ContainerDied","Data":"115d58de60e7de31dab6797bc7762256d680a06f202da1d69c04f40630ab01e2"} Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.643324 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c","Type":"ContainerDied","Data":"e69205269ae58f6dcbb3393c3e2d577675c6f416145c65fe9bb77fe4cfbb3d48"} Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.645722 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b46d4bbb-w8vmj" event={"ID":"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef","Type":"ContainerStarted","Data":"8dd835aeec3a6b54f83f39a520ec4721c5cafac504197fd018ade1ecbfb91a73"} Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.645902 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.649429 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" event={"ID":"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3","Type":"ContainerStarted","Data":"7041b9f35263279ee82e63a63843ff21479f24fc00cc9f5399f091c0bc88c62c"} Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.649588 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.651814 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"41935cdd-b285-4d02-b181-9675b1570b83","Type":"ContainerStarted","Data":"48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d"} Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.681730 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56b46d4bbb-w8vmj" podStartSLOduration=4.681704518 podStartE2EDuration="4.681704518s" podCreationTimestamp="2025-12-02 08:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:26.668857302 +0000 UTC m=+1174.452936164" watchObservedRunningTime="2025-12-02 08:05:26.681704518 +0000 UTC m=+1174.465783380" Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.718299 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-t65tl"] Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.724874 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b895b5785-t65tl"] Dec 02 08:05:26 crc kubenswrapper[4691]: I1202 08:05:26.734472 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" podStartSLOduration=4.734450685 podStartE2EDuration="4.734450685s" podCreationTimestamp="2025-12-02 08:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:26.729304958 +0000 UTC m=+1174.513383830" watchObservedRunningTime="2025-12-02 08:05:26.734450685 +0000 UTC m=+1174.518529547" Dec 02 08:05:27 crc kubenswrapper[4691]: I1202 08:05:27.149035 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:05:27 crc kubenswrapper[4691]: I1202 08:05:27.664380 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"41935cdd-b285-4d02-b181-9675b1570b83","Type":"ContainerStarted","Data":"48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743"} Dec 02 08:05:27 crc kubenswrapper[4691]: I1202 08:05:27.664851 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="41935cdd-b285-4d02-b181-9675b1570b83" containerName="cinder-api-log" containerID="cri-o://48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d" gracePeriod=30 Dec 02 08:05:27 crc kubenswrapper[4691]: I1202 08:05:27.665470 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 08:05:27 crc kubenswrapper[4691]: I1202 08:05:27.665834 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="41935cdd-b285-4d02-b181-9675b1570b83" containerName="cinder-api" containerID="cri-o://48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743" gracePeriod=30 Dec 02 08:05:27 crc kubenswrapper[4691]: I1202 08:05:27.679060 4691 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e29c75f3-7434-4bf1-a23e-818f8d998754","Type":"ContainerStarted","Data":"da54da10259eaa22d6d8c15d99233d36b56b2ebfd4eaeb3b712b5e6854ccf7d9"} Dec 02 08:05:27 crc kubenswrapper[4691]: I1202 08:05:27.679377 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:27 crc kubenswrapper[4691]: I1202 08:05:27.700410 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.70038784 podStartE2EDuration="5.70038784s" podCreationTimestamp="2025-12-02 08:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:27.690071626 +0000 UTC m=+1175.474150488" watchObservedRunningTime="2025-12-02 08:05:27.70038784 +0000 UTC m=+1175.484466702" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.508613 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-86bc664f5b-6lklq"] Dec 02 08:05:28 crc kubenswrapper[4691]: E1202 08:05:28.509871 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="952f1681-e30a-41fc-9d4e-17d2685e1776" containerName="init" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.509894 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="952f1681-e30a-41fc-9d4e-17d2685e1776" containerName="init" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.510141 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="952f1681-e30a-41fc-9d4e-17d2685e1776" containerName="init" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.511400 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86bc664f5b-6lklq"] Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.511493 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.518829 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.519063 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.610719 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-combined-ca-bundle\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.610788 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-config-data\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.610826 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-logs\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.610885 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-config-data-custom\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.611027 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-public-tls-certs\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.611067 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-internal-tls-certs\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.611118 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjtzh\" (UniqueName: \"kubernetes.io/projected/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-kube-api-access-fjtzh\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.616451 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.664662 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="952f1681-e30a-41fc-9d4e-17d2685e1776" path="/var/lib/kubelet/pods/952f1681-e30a-41fc-9d4e-17d2685e1776/volumes" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.713497 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-scripts\") pod \"41935cdd-b285-4d02-b181-9675b1570b83\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.713813 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41935cdd-b285-4d02-b181-9675b1570b83-logs\") pod \"41935cdd-b285-4d02-b181-9675b1570b83\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.713917 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-config-data-custom\") pod \"41935cdd-b285-4d02-b181-9675b1570b83\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.714028 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-combined-ca-bundle\") pod \"41935cdd-b285-4d02-b181-9675b1570b83\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.714056 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-config-data\") pod \"41935cdd-b285-4d02-b181-9675b1570b83\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.714091 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41935cdd-b285-4d02-b181-9675b1570b83-etc-machine-id\") pod \"41935cdd-b285-4d02-b181-9675b1570b83\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.714118 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xxdz\" (UniqueName: \"kubernetes.io/projected/41935cdd-b285-4d02-b181-9675b1570b83-kube-api-access-9xxdz\") pod \"41935cdd-b285-4d02-b181-9675b1570b83\" (UID: \"41935cdd-b285-4d02-b181-9675b1570b83\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.714362 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-config-data-custom\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.714448 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-public-tls-certs\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 
08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.714472 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-internal-tls-certs\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.714515 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjtzh\" (UniqueName: \"kubernetes.io/projected/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-kube-api-access-fjtzh\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.714594 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-combined-ca-bundle\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.714617 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-config-data\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.714636 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-logs\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.715143 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-logs\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.720192 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "41935cdd-b285-4d02-b181-9675b1570b83" (UID: "41935cdd-b285-4d02-b181-9675b1570b83"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.720276 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/41935cdd-b285-4d02-b181-9675b1570b83-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "41935cdd-b285-4d02-b181-9675b1570b83" (UID: "41935cdd-b285-4d02-b181-9675b1570b83"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.720083 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41935cdd-b285-4d02-b181-9675b1570b83-logs" (OuterVolumeSpecName: "logs") pod "41935cdd-b285-4d02-b181-9675b1570b83" (UID: "41935cdd-b285-4d02-b181-9675b1570b83"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.729523 4691 generic.go:334] "Generic (PLEG): container finished" podID="41935cdd-b285-4d02-b181-9675b1570b83" containerID="48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743" exitCode=0 Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.729547 4691 generic.go:334] "Generic (PLEG): container finished" podID="41935cdd-b285-4d02-b181-9675b1570b83" containerID="48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d" exitCode=143 Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.729598 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.729627 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"41935cdd-b285-4d02-b181-9675b1570b83","Type":"ContainerDied","Data":"48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743"} Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.729660 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"41935cdd-b285-4d02-b181-9675b1570b83","Type":"ContainerDied","Data":"48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d"} Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.729671 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"41935cdd-b285-4d02-b181-9675b1570b83","Type":"ContainerDied","Data":"016f291c6334b761af4a0205c006d606f6407c5093ae10312d24088c59697d8d"} Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.729692 4691 scope.go:117] "RemoveContainer" containerID="48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.735404 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.739985 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-config-data\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.740267 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-public-tls-certs\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.740635 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41935cdd-b285-4d02-b181-9675b1570b83-kube-api-access-9xxdz" (OuterVolumeSpecName: "kube-api-access-9xxdz") pod "41935cdd-b285-4d02-b181-9675b1570b83" (UID: "41935cdd-b285-4d02-b181-9675b1570b83"). InnerVolumeSpecName "kube-api-access-9xxdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.740915 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-combined-ca-bundle\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.742796 4691 generic.go:334] "Generic (PLEG): container finished" podID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerID="f9d31b20778f758e91f3f422397acd3b8b8c5f7098632a78bd1e715e0847037f" exitCode=0 Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.742872 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c","Type":"ContainerDied","Data":"f9d31b20778f758e91f3f422397acd3b8b8c5f7098632a78bd1e715e0847037f"} Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.744266 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-internal-tls-certs\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.747138 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjtzh\" (UniqueName: \"kubernetes.io/projected/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-kube-api-access-fjtzh\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.749286 4691 generic.go:334] "Generic (PLEG): container finished" podID="24910bce-2ac5-4966-af8e-48dad2b11370" containerID="4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667" exitCode=137 Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.749328 4691 generic.go:334] "Generic (PLEG): container finished" podID="24910bce-2ac5-4966-af8e-48dad2b11370" containerID="f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74" exitCode=137 Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.749450 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7c4b8f5f8f-l7b75" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.749639 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c4b8f5f8f-l7b75" event={"ID":"24910bce-2ac5-4966-af8e-48dad2b11370","Type":"ContainerDied","Data":"4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667"} Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.749707 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7c4b8f5f8f-l7b75" event={"ID":"24910bce-2ac5-4966-af8e-48dad2b11370","Type":"ContainerDied","Data":"f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74"} Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.753614 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff11f2ca-96b6-4cd2-85b8-88916b74efc7-config-data-custom\") pod \"barbican-api-86bc664f5b-6lklq\" (UID: \"ff11f2ca-96b6-4cd2-85b8-88916b74efc7\") " pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.757139 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-scripts" (OuterVolumeSpecName: "scripts") pod "41935cdd-b285-4d02-b181-9675b1570b83" (UID: "41935cdd-b285-4d02-b181-9675b1570b83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.768459 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57597556c5-xzp56" event={"ID":"3c13685c-ff97-4074-8bc8-5659d16ec95d","Type":"ContainerStarted","Data":"8e622935fc5cf53c7b25e2e8a774693b164998c25ed5d35a7aec0b24a4c6b69b"} Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.772815 4691 scope.go:117] "RemoveContainer" containerID="48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.781258 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.786591 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "41935cdd-b285-4d02-b181-9675b1570b83" (UID: "41935cdd-b285-4d02-b181-9675b1570b83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.813443 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-config-data" (OuterVolumeSpecName: "config-data") pod "41935cdd-b285-4d02-b181-9675b1570b83" (UID: "41935cdd-b285-4d02-b181-9675b1570b83"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.816124 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-config-data\") pod \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.816264 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-sg-core-conf-yaml\") pod \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.816366 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24910bce-2ac5-4966-af8e-48dad2b11370-scripts\") pod \"24910bce-2ac5-4966-af8e-48dad2b11370\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.816405 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-log-httpd\") pod \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.816436 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-scripts\") pod \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.816483 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24910bce-2ac5-4966-af8e-48dad2b11370-horizon-secret-key\") pod \"24910bce-2ac5-4966-af8e-48dad2b11370\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.816517 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wbpv\" (UniqueName: \"kubernetes.io/projected/24910bce-2ac5-4966-af8e-48dad2b11370-kube-api-access-8wbpv\") pod \"24910bce-2ac5-4966-af8e-48dad2b11370\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.816831 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24910bce-2ac5-4966-af8e-48dad2b11370-config-data\") pod \"24910bce-2ac5-4966-af8e-48dad2b11370\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.816887 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24910bce-2ac5-4966-af8e-48dad2b11370-logs\") pod \"24910bce-2ac5-4966-af8e-48dad2b11370\" (UID: \"24910bce-2ac5-4966-af8e-48dad2b11370\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.816915 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-combined-ca-bundle\") pod \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " Dec 02 
08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.816938 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2qd7\" (UniqueName: \"kubernetes.io/projected/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-kube-api-access-x2qd7\") pod \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.817010 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-run-httpd\") pod \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\" (UID: \"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c\") " Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.817445 4691 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.817462 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.817474 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.817484 4691 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/41935cdd-b285-4d02-b181-9675b1570b83-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.817493 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xxdz\" (UniqueName: \"kubernetes.io/projected/41935cdd-b285-4d02-b181-9675b1570b83-kube-api-access-9xxdz\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.817503 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/41935cdd-b285-4d02-b181-9675b1570b83-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.817512 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/41935cdd-b285-4d02-b181-9675b1570b83-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.817980 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24910bce-2ac5-4966-af8e-48dad2b11370-logs" (OuterVolumeSpecName: "logs") pod "24910bce-2ac5-4966-af8e-48dad2b11370" (UID: "24910bce-2ac5-4966-af8e-48dad2b11370"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.819589 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" (UID: "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.823289 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" (UID: "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.827481 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24910bce-2ac5-4966-af8e-48dad2b11370-kube-api-access-8wbpv" (OuterVolumeSpecName: "kube-api-access-8wbpv") pod "24910bce-2ac5-4966-af8e-48dad2b11370" (UID: "24910bce-2ac5-4966-af8e-48dad2b11370"). InnerVolumeSpecName "kube-api-access-8wbpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.829811 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24910bce-2ac5-4966-af8e-48dad2b11370-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "24910bce-2ac5-4966-af8e-48dad2b11370" (UID: "24910bce-2ac5-4966-af8e-48dad2b11370"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.830165 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-kube-api-access-x2qd7" (OuterVolumeSpecName: "kube-api-access-x2qd7") pod "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" (UID: "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c"). InnerVolumeSpecName "kube-api-access-x2qd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.847096 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-scripts" (OuterVolumeSpecName: "scripts") pod "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" (UID: "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.847080 4691 scope.go:117] "RemoveContainer" containerID="48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743" Dec 02 08:05:28 crc kubenswrapper[4691]: E1202 08:05:28.851311 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743\": container with ID starting with 48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743 not found: ID does not exist" containerID="48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.851355 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743"} err="failed to get container status \"48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743\": rpc error: code = NotFound desc = could not find container \"48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743\": container with ID starting with 48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743 not found: ID does not exist" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.851381 4691 scope.go:117] "RemoveContainer" containerID="48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d" Dec 02 08:05:28 crc kubenswrapper[4691]: E1202 08:05:28.852384 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d\": container with ID starting with 48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d not found: ID does not exist" containerID="48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.852424 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d"} err="failed to get container status \"48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d\": rpc error: code = NotFound desc = could not find container \"48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d\": container with ID starting with 48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d not found: ID does not exist" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.852456 4691 scope.go:117] "RemoveContainer" containerID="48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.853613 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743"} err="failed to get container status \"48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743\": rpc error: code = NotFound desc = could not find container \"48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743\": container with ID starting with 48170dc4871cd125dddfb3a640a7a872b25bede98aad59432875b4e555202743 not found: ID does not exist" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.853637 4691 scope.go:117] "RemoveContainer" containerID="48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.854486 4691 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d"} err="failed to get container status \"48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d\": rpc error: code = NotFound desc = could not find container \"48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d\": container with ID starting with 48a758bbd470bad9fece90ce633a58e3ad4f9e6468561385e5252e21bfbef98d not found: ID does not exist" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.854507 4691 scope.go:117] "RemoveContainer" containerID="4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.888181 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" (UID: "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.891709 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24910bce-2ac5-4966-af8e-48dad2b11370-scripts" (OuterVolumeSpecName: "scripts") pod "24910bce-2ac5-4966-af8e-48dad2b11370" (UID: "24910bce-2ac5-4966-af8e-48dad2b11370"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.902783 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24910bce-2ac5-4966-af8e-48dad2b11370-config-data" (OuterVolumeSpecName: "config-data") pod "24910bce-2ac5-4966-af8e-48dad2b11370" (UID: "24910bce-2ac5-4966-af8e-48dad2b11370"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.920328 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.920361 4691 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/24910bce-2ac5-4966-af8e-48dad2b11370-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.920373 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wbpv\" (UniqueName: \"kubernetes.io/projected/24910bce-2ac5-4966-af8e-48dad2b11370-kube-api-access-8wbpv\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.920384 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/24910bce-2ac5-4966-af8e-48dad2b11370-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.920398 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24910bce-2ac5-4966-af8e-48dad2b11370-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.920410 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2qd7\" (UniqueName: \"kubernetes.io/projected/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-kube-api-access-x2qd7\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.920423 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.920434 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.920446 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24910bce-2ac5-4966-af8e-48dad2b11370-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.920456 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.948879 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" (UID: "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.958570 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:28 crc kubenswrapper[4691]: I1202 08:05:28.993173 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-config-data" (OuterVolumeSpecName: "config-data") pod "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" (UID: "7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.022099 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.022147 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.085529 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.110128 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.110168 4691 scope.go:117] "RemoveContainer" containerID="f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.145521 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:05:29 crc kubenswrapper[4691]: E1202 08:05:29.146011 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="ceilometer-notification-agent" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146027 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="ceilometer-notification-agent" Dec 02 08:05:29 crc kubenswrapper[4691]: E1202 08:05:29.146072 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24910bce-2ac5-4966-af8e-48dad2b11370" containerName="horizon" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146079 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="24910bce-2ac5-4966-af8e-48dad2b11370" containerName="horizon" Dec 02 08:05:29 crc kubenswrapper[4691]: E1202 08:05:29.146092 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24910bce-2ac5-4966-af8e-48dad2b11370" containerName="horizon-log" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146098 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="24910bce-2ac5-4966-af8e-48dad2b11370" containerName="horizon-log" Dec 02 08:05:29 crc kubenswrapper[4691]: E1202 08:05:29.146112 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41935cdd-b285-4d02-b181-9675b1570b83" containerName="cinder-api-log" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146119 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="41935cdd-b285-4d02-b181-9675b1570b83" containerName="cinder-api-log" Dec 02 08:05:29 crc kubenswrapper[4691]: E1202 08:05:29.146135 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="sg-core" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146141 4691 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="sg-core" Dec 02 08:05:29 crc kubenswrapper[4691]: E1202 08:05:29.146156 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="ceilometer-central-agent" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146162 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="ceilometer-central-agent" Dec 02 08:05:29 crc kubenswrapper[4691]: E1202 08:05:29.146177 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41935cdd-b285-4d02-b181-9675b1570b83" containerName="cinder-api" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146184 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="41935cdd-b285-4d02-b181-9675b1570b83" containerName="cinder-api" Dec 02 08:05:29 crc kubenswrapper[4691]: E1202 08:05:29.146199 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="proxy-httpd" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146205 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="proxy-httpd" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146436 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="ceilometer-central-agent" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146459 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="ceilometer-notification-agent" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146472 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="sg-core" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146485 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="24910bce-2ac5-4966-af8e-48dad2b11370" containerName="horizon" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146500 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" containerName="proxy-httpd" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146511 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="24910bce-2ac5-4966-af8e-48dad2b11370" containerName="horizon-log" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146525 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="41935cdd-b285-4d02-b181-9675b1570b83" containerName="cinder-api-log" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.146538 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="41935cdd-b285-4d02-b181-9675b1570b83" containerName="cinder-api" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.148079 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.151342 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.151694 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.151734 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.158631 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7c4b8f5f8f-l7b75"] Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.172884 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7c4b8f5f8f-l7b75"] Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.181726 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.225145 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.225252 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e3d72e5-726a-4f4b-a677-6237021e8747-logs\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.225278 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-scripts\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.225307 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-config-data\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.225340 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhp6f\" (UniqueName: \"kubernetes.io/projected/0e3d72e5-726a-4f4b-a677-6237021e8747-kube-api-access-lhp6f\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.226131 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e3d72e5-726a-4f4b-a677-6237021e8747-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.226168 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-config-data-custom\") pod \"cinder-api-0\" 
(UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.226417 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.226538 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.291485 4691 scope.go:117] "RemoveContainer" containerID="4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667" Dec 02 08:05:29 crc kubenswrapper[4691]: E1202 08:05:29.292117 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667\": container with ID starting with 4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667 not found: ID does not exist" containerID="4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.292144 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667"} err="failed to get container status \"4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667\": rpc error: code = NotFound desc = could not find container \"4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667\": container with ID starting with 4ddd7ff08fd9b3e3a5c7bd62e0add179bf0635e205713df88519d37081b14667 not found: ID does not exist" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.292167 4691 scope.go:117] "RemoveContainer" containerID="f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74" Dec 02 08:05:29 crc kubenswrapper[4691]: E1202 08:05:29.292496 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74\": container with ID starting with f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74 not found: ID does not exist" containerID="f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.292513 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74"} err="failed to get container status \"f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74\": rpc error: code = NotFound desc = could not find container \"f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74\": container with ID starting with f6769a234a43fe440177293574bbe35b09d29a43efa67c95be49ab4c2b278d74 not found: ID does not exist" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.327783 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/0e3d72e5-726a-4f4b-a677-6237021e8747-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.328073 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-config-data-custom\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.328153 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.328179 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.328220 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.328301 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e3d72e5-726a-4f4b-a677-6237021e8747-logs\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.328334 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-scripts\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.328360 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-config-data\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.328400 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhp6f\" (UniqueName: \"kubernetes.io/projected/0e3d72e5-726a-4f4b-a677-6237021e8747-kube-api-access-lhp6f\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.329260 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0e3d72e5-726a-4f4b-a677-6237021e8747-etc-machine-id\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.335217 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/0e3d72e5-726a-4f4b-a677-6237021e8747-logs\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.336383 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-public-tls-certs\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.340147 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-config-data-custom\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.340251 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-scripts\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.342073 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-config-data\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.343134 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.345220 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e3d72e5-726a-4f4b-a677-6237021e8747-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.349111 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhp6f\" (UniqueName: \"kubernetes.io/projected/0e3d72e5-726a-4f4b-a677-6237021e8747-kube-api-access-lhp6f\") pod \"cinder-api-0\" (UID: \"0e3d72e5-726a-4f4b-a677-6237021e8747\") " pod="openstack/cinder-api-0" Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.600003 4691 util.go:30] "No sandbox for pod can be found. 
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.616485 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-86bc664f5b-6lklq"]
Dec 02 08:05:29 crc kubenswrapper[4691]: W1202 08:05:29.620936 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff11f2ca_96b6_4cd2_85b8_88916b74efc7.slice/crio-de856462ac715dbc2fabde494f742d947d934f840109726c5324010b270ed5b2 WatchSource:0}: Error finding container de856462ac715dbc2fabde494f742d947d934f840109726c5324010b270ed5b2: Status 404 returned error can't find the container with id de856462ac715dbc2fabde494f742d947d934f840109726c5324010b270ed5b2
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.798337 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e29c75f3-7434-4bf1-a23e-818f8d998754","Type":"ContainerStarted","Data":"9eb9caf7e2c0b48ad0d9233078e0005064fa9a0585e725ec7e08433698f05250"}
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.805631 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" event={"ID":"d855ba5e-92ab-4a5e-b613-f49c9fec44b1","Type":"ContainerStarted","Data":"79480a3b2626c5bfddb33c6a7c4ba5bab76db38c94acdcb9605003f17e2701af"}
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.805681 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" event={"ID":"d855ba5e-92ab-4a5e-b613-f49c9fec44b1","Type":"ContainerStarted","Data":"a8882e27e928459ffaef4937350da19c5e7d8f595e602d7bb0f9a9a9f2240375"}
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.831385 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-57597556c5-xzp56" event={"ID":"3c13685c-ff97-4074-8bc8-5659d16ec95d","Type":"ContainerStarted","Data":"98b19a9bf22bf648ed0601bfe82f51adc9199b80dd9cb4c2704e4f33373904eb"}
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.846003 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=7.8030648970000005 podStartE2EDuration="8.845978834s" podCreationTimestamp="2025-12-02 08:05:21 +0000 UTC" firstStartedPulling="2025-12-02 08:05:25.025428423 +0000 UTC m=+1172.809507285" lastFinishedPulling="2025-12-02 08:05:26.06834236 +0000 UTC m=+1173.852421222" observedRunningTime="2025-12-02 08:05:29.843891732 +0000 UTC m=+1177.627970594" watchObservedRunningTime="2025-12-02 08:05:29.845978834 +0000 UTC m=+1177.630057696"
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.879861 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bc664f5b-6lklq" event={"ID":"ff11f2ca-96b6-4cd2-85b8-88916b74efc7","Type":"ContainerStarted","Data":"de856462ac715dbc2fabde494f742d947d934f840109726c5324010b270ed5b2"}
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.920385 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c","Type":"ContainerDied","Data":"4ada966b7afba1b3f5e9530b999abd22ea7a850046b315324901727f54ed8195"}
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.920447 4691 scope.go:117] "RemoveContainer" containerID="7d2275e4170bb9ead85515498234bdace94b051a083ba93fafd25980fadfcf7e"
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.920621 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.964705 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-598b69485d-n2fl7" podStartSLOduration=4.695286661 podStartE2EDuration="7.964677421s" podCreationTimestamp="2025-12-02 08:05:22 +0000 UTC" firstStartedPulling="2025-12-02 08:05:24.954320695 +0000 UTC m=+1172.738399557" lastFinishedPulling="2025-12-02 08:05:28.223711455 +0000 UTC m=+1176.007790317" observedRunningTime="2025-12-02 08:05:29.909917565 +0000 UTC m=+1177.693996427" watchObservedRunningTime="2025-12-02 08:05:29.964677421 +0000 UTC m=+1177.748756283"
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.969438 4691 scope.go:117] "RemoveContainer" containerID="115d58de60e7de31dab6797bc7762256d680a06f202da1d69c04f40630ab01e2"
Dec 02 08:05:29 crc kubenswrapper[4691]: I1202 08:05:29.986140 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-57597556c5-xzp56" podStartSLOduration=4.772979041 podStartE2EDuration="7.986113408s" podCreationTimestamp="2025-12-02 08:05:22 +0000 UTC" firstStartedPulling="2025-12-02 08:05:25.012959856 +0000 UTC m=+1172.797038718" lastFinishedPulling="2025-12-02 08:05:28.226094223 +0000 UTC m=+1176.010173085" observedRunningTime="2025-12-02 08:05:29.934956211 +0000 UTC m=+1177.719035073" watchObservedRunningTime="2025-12-02 08:05:29.986113408 +0000 UTC m=+1177.770192260"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.071732 4691 scope.go:117] "RemoveContainer" containerID="f9d31b20778f758e91f3f422397acd3b8b8c5f7098632a78bd1e715e0847037f"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.084199 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.105094 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.115880 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.124565 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.155980 4691 scope.go:117] "RemoveContainer" containerID="e69205269ae58f6dcbb3393c3e2d577675c6f416145c65fe9bb77fe4cfbb3d48"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.184201 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.184340 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
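The pod_startup_latency_tracker entries carry their own arithmetic: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration additionally subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). The monotonic `m=+...` offsets make the pull window easy to compute. Checking the cinder-scheduler-0 line above in a few lines of Go:

```go
// Reproduce the podStartSLOduration arithmetic from the
// cinder-scheduler-0 startup-latency line:
//   SLO duration = E2E duration - time spent pulling images.
package main

import "fmt"

func main() {
	e2e := 8.845978834                      // podStartE2EDuration, seconds
	pull := 1173.852421222 - 1172.809507285 // lastFinishedPulling - firstStartedPulling (m offsets)
	fmt.Printf("podStartSLOduration=%.9f\n", e2e-pull) // 7.803064897
}
```

The result matches the logged podStartSLOduration=7.8030648970000005 (the extra digits are just float64 formatting).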
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.187058 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.189581 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.266020 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drg65\" (UniqueName: \"kubernetes.io/projected/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-kube-api-access-drg65\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.266486 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-log-httpd\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.266567 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.266611 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-run-httpd\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.266647 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-config-data\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.266671 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.266699 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-scripts\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.369621 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.369705 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-run-httpd\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.369743 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-config-data\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.369788 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.369817 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-scripts\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.369857 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drg65\" (UniqueName: \"kubernetes.io/projected/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-kube-api-access-drg65\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.369953 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-log-httpd\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.370547 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-log-httpd\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.372210 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-run-httpd\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.374842 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.376226 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.377330 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-scripts\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0"
\"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-scripts\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0" Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.397006 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-config-data\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0" Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.401400 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drg65\" (UniqueName: \"kubernetes.io/projected/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-kube-api-access-drg65\") pod \"ceilometer-0\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " pod="openstack/ceilometer-0" Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.533723 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.574753 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24910bce-2ac5-4966-af8e-48dad2b11370" path="/var/lib/kubelet/pods/24910bce-2ac5-4966-af8e-48dad2b11370/volumes" Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.575428 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41935cdd-b285-4d02-b181-9675b1570b83" path="/var/lib/kubelet/pods/41935cdd-b285-4d02-b181-9675b1570b83/volumes" Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.576183 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c" path="/var/lib/kubelet/pods/7e0dc5ba-eb39-4cb8-9c8e-c5a087c6825c/volumes" Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.942601 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0e3d72e5-726a-4f4b-a677-6237021e8747","Type":"ContainerStarted","Data":"1a99085089c6ad711d88ca89bbaeabea7afdeb87bd007b91d7fb04c95ac3db09"} Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.947477 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bc664f5b-6lklq" event={"ID":"ff11f2ca-96b6-4cd2-85b8-88916b74efc7","Type":"ContainerStarted","Data":"7ad49315ad66efbd0a5e69aff99275cd7ba79f0ad8aff8336f5184dc5589436a"} Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.947512 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-86bc664f5b-6lklq" event={"ID":"ff11f2ca-96b6-4cd2-85b8-88916b74efc7","Type":"ContainerStarted","Data":"bfe0bd33737573ca584e3e5fe4d53c8a38f8bc3f1d387b8ae081fedd9f65f605"} Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.948611 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.948861 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:30 crc kubenswrapper[4691]: I1202 08:05:30.981360 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-86bc664f5b-6lklq" podStartSLOduration=2.981337694 podStartE2EDuration="2.981337694s" podCreationTimestamp="2025-12-02 08:05:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:30.971944683 +0000 UTC 
m=+1178.756023545" watchObservedRunningTime="2025-12-02 08:05:30.981337694 +0000 UTC m=+1178.765416556" Dec 02 08:05:31 crc kubenswrapper[4691]: I1202 08:05:31.072985 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:05:31 crc kubenswrapper[4691]: W1202 08:05:31.078977 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcba2ebbb_0458_4a48_ae0f_dc8a99f33c07.slice/crio-453cc329040315ecb996b7f4e79ab36dc1695d9ff9cf64bb579ab9d35c8b6eb3 WatchSource:0}: Error finding container 453cc329040315ecb996b7f4e79ab36dc1695d9ff9cf64bb579ab9d35c8b6eb3: Status 404 returned error can't find the container with id 453cc329040315ecb996b7f4e79ab36dc1695d9ff9cf64bb579ab9d35c8b6eb3 Dec 02 08:05:31 crc kubenswrapper[4691]: I1202 08:05:31.709418 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-95567dd97-rcpxr" Dec 02 08:05:31 crc kubenswrapper[4691]: I1202 08:05:31.802237 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-54f65c9796-xhrrr"] Dec 02 08:05:31 crc kubenswrapper[4691]: I1202 08:05:31.802479 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-54f65c9796-xhrrr" podUID="7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" containerName="neutron-api" containerID="cri-o://71151114337b8625dba3b9392398eed09d96a2c6af0ac7b621b5a223d7aac372" gracePeriod=30 Dec 02 08:05:31 crc kubenswrapper[4691]: I1202 08:05:31.802607 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-54f65c9796-xhrrr" podUID="7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" containerName="neutron-httpd" containerID="cri-o://2120bbc5ef060ade1923bcfcc752251e72b4c912cc46b65a5f836d511bff345d" gracePeriod=30 Dec 02 08:05:31 crc kubenswrapper[4691]: I1202 08:05:31.980748 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0e3d72e5-726a-4f4b-a677-6237021e8747","Type":"ContainerStarted","Data":"78680ff6d7c6228b3be1b9d0cb7999353e8b3bcb40cb3ad8e5358750255682b0"} Dec 02 08:05:32 crc kubenswrapper[4691]: I1202 08:05:32.003013 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07","Type":"ContainerStarted","Data":"f2f72461cefac4009499f370f2aee2dba073b86341673e63b08e3bdde9a08fea"} Dec 02 08:05:32 crc kubenswrapper[4691]: I1202 08:05:32.003076 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07","Type":"ContainerStarted","Data":"453cc329040315ecb996b7f4e79ab36dc1695d9ff9cf64bb579ab9d35c8b6eb3"} Dec 02 08:05:32 crc kubenswrapper[4691]: I1202 08:05:32.032232 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 08:05:32 crc kubenswrapper[4691]: I1202 08:05:32.326471 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 08:05:32 crc kubenswrapper[4691]: I1202 08:05:32.855936 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:05:32 crc kubenswrapper[4691]: I1202 08:05:32.915118 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-qrkdk"] Dec 02 08:05:32 crc kubenswrapper[4691]: I1202 08:05:32.916610 4691 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" podUID="fb8ecb10-bbe6-4998-83cb-97441258d0c9" containerName="dnsmasq-dns" containerID="cri-o://d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6" gracePeriod=10 Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.018264 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07","Type":"ContainerStarted","Data":"fd9fd6183d7b0ef9a454e4d763e05be12eb9a2def7ebf5604b2a4b828d787944"} Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.028869 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"0e3d72e5-726a-4f4b-a677-6237021e8747","Type":"ContainerStarted","Data":"16090304f5e0e031c8c473e5ddd9744d3eb3279aacde728b7fd33baee3f3fe1f"} Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.030479 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.039643 4691 generic.go:334] "Generic (PLEG): container finished" podID="7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" containerID="2120bbc5ef060ade1923bcfcc752251e72b4c912cc46b65a5f836d511bff345d" exitCode=0 Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.040292 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54f65c9796-xhrrr" event={"ID":"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4","Type":"ContainerDied","Data":"2120bbc5ef060ade1923bcfcc752251e72b4c912cc46b65a5f836d511bff345d"} Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.083736 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.083717675 podStartE2EDuration="4.083717675s" podCreationTimestamp="2025-12-02 08:05:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:33.077642206 +0000 UTC m=+1180.861721068" watchObservedRunningTime="2025-12-02 08:05:33.083717675 +0000 UTC m=+1180.867796537" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.138543 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.558426 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.649697 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-dns-swift-storage-0\") pod \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.649742 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-ovsdbserver-sb\") pod \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.649795 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjp99\" (UniqueName: \"kubernetes.io/projected/fb8ecb10-bbe6-4998-83cb-97441258d0c9-kube-api-access-qjp99\") pod \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.650550 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-config\") pod \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.650654 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-ovsdbserver-nb\") pod \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.650685 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-dns-svc\") pod \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\" (UID: \"fb8ecb10-bbe6-4998-83cb-97441258d0c9\") " Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.676234 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8ecb10-bbe6-4998-83cb-97441258d0c9-kube-api-access-qjp99" (OuterVolumeSpecName: "kube-api-access-qjp99") pod "fb8ecb10-bbe6-4998-83cb-97441258d0c9" (UID: "fb8ecb10-bbe6-4998-83cb-97441258d0c9"). InnerVolumeSpecName "kube-api-access-qjp99". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.709727 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-config" (OuterVolumeSpecName: "config") pod "fb8ecb10-bbe6-4998-83cb-97441258d0c9" (UID: "fb8ecb10-bbe6-4998-83cb-97441258d0c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.709770 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb8ecb10-bbe6-4998-83cb-97441258d0c9" (UID: "fb8ecb10-bbe6-4998-83cb-97441258d0c9"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.717247 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb8ecb10-bbe6-4998-83cb-97441258d0c9" (UID: "fb8ecb10-bbe6-4998-83cb-97441258d0c9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.742346 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fb8ecb10-bbe6-4998-83cb-97441258d0c9" (UID: "fb8ecb10-bbe6-4998-83cb-97441258d0c9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.750354 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb8ecb10-bbe6-4998-83cb-97441258d0c9" (UID: "fb8ecb10-bbe6-4998-83cb-97441258d0c9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.753663 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.753702 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.753715 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjp99\" (UniqueName: \"kubernetes.io/projected/fb8ecb10-bbe6-4998-83cb-97441258d0c9-kube-api-access-qjp99\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.753732 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.753746 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:33 crc kubenswrapper[4691]: I1202 08:05:33.753787 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb8ecb10-bbe6-4998-83cb-97441258d0c9-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.053834 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07","Type":"ContainerStarted","Data":"bd4fa8d4f08b3370eb4b2ee453c12865833b8f49bbc5e312a060a4964d03ca3f"} Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.056326 4691 generic.go:334] "Generic (PLEG): container finished" podID="fb8ecb10-bbe6-4998-83cb-97441258d0c9" containerID="d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6" exitCode=0 Dec 02 08:05:34 crc 
kubenswrapper[4691]: I1202 08:05:34.056541 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e29c75f3-7434-4bf1-a23e-818f8d998754" containerName="cinder-scheduler" containerID="cri-o://da54da10259eaa22d6d8c15d99233d36b56b2ebfd4eaeb3b712b5e6854ccf7d9" gracePeriod=30 Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.056908 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.061905 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" event={"ID":"fb8ecb10-bbe6-4998-83cb-97441258d0c9","Type":"ContainerDied","Data":"d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6"} Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.061996 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-qrkdk" event={"ID":"fb8ecb10-bbe6-4998-83cb-97441258d0c9","Type":"ContainerDied","Data":"0ce5bf8864400b065cd968591033518daa777fa46a186c6b51011a8411cbe770"} Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.062022 4691 scope.go:117] "RemoveContainer" containerID="d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6" Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.061905 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e29c75f3-7434-4bf1-a23e-818f8d998754" containerName="probe" containerID="cri-o://9eb9caf7e2c0b48ad0d9233078e0005064fa9a0585e725ec7e08433698f05250" gracePeriod=30 Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.114048 4691 scope.go:117] "RemoveContainer" containerID="e83e3bd223ae0c3f5154acceccdb2269d67b02adb72ef1c750e5fa4b87b239a4" Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.114424 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-qrkdk"] Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.138723 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-qrkdk"] Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.162852 4691 scope.go:117] "RemoveContainer" containerID="d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6" Dec 02 08:05:34 crc kubenswrapper[4691]: E1202 08:05:34.167321 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6\": container with ID starting with d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6 not found: ID does not exist" containerID="d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6" Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.167371 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6"} err="failed to get container status \"d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6\": rpc error: code = NotFound desc = could not find container \"d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6\": container with ID starting with d7ca0d25e82e17a5f072ba94107b937729f766ae50a5dc4586e02ec8fc1dc1c6 not found: ID does not exist" Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.167401 4691 scope.go:117] "RemoveContainer" 
containerID="e83e3bd223ae0c3f5154acceccdb2269d67b02adb72ef1c750e5fa4b87b239a4" Dec 02 08:05:34 crc kubenswrapper[4691]: E1202 08:05:34.167954 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e83e3bd223ae0c3f5154acceccdb2269d67b02adb72ef1c750e5fa4b87b239a4\": container with ID starting with e83e3bd223ae0c3f5154acceccdb2269d67b02adb72ef1c750e5fa4b87b239a4 not found: ID does not exist" containerID="e83e3bd223ae0c3f5154acceccdb2269d67b02adb72ef1c750e5fa4b87b239a4" Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.168013 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e83e3bd223ae0c3f5154acceccdb2269d67b02adb72ef1c750e5fa4b87b239a4"} err="failed to get container status \"e83e3bd223ae0c3f5154acceccdb2269d67b02adb72ef1c750e5fa4b87b239a4\": rpc error: code = NotFound desc = could not find container \"e83e3bd223ae0c3f5154acceccdb2269d67b02adb72ef1c750e5fa4b87b239a4\": container with ID starting with e83e3bd223ae0c3f5154acceccdb2269d67b02adb72ef1c750e5fa4b87b239a4 not found: ID does not exist" Dec 02 08:05:34 crc kubenswrapper[4691]: I1202 08:05:34.587156 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb8ecb10-bbe6-4998-83cb-97441258d0c9" path="/var/lib/kubelet/pods/fb8ecb10-bbe6-4998-83cb-97441258d0c9/volumes" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.091150 4691 generic.go:334] "Generic (PLEG): container finished" podID="7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" containerID="71151114337b8625dba3b9392398eed09d96a2c6af0ac7b621b5a223d7aac372" exitCode=0 Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.091844 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54f65c9796-xhrrr" event={"ID":"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4","Type":"ContainerDied","Data":"71151114337b8625dba3b9392398eed09d96a2c6af0ac7b621b5a223d7aac372"} Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.444698 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.477128 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.513443 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-combined-ca-bundle\") pod \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.513497 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-config\") pod \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.513590 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lktp2\" (UniqueName: \"kubernetes.io/projected/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-kube-api-access-lktp2\") pod \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.513643 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-httpd-config\") pod \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.513702 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-ovndb-tls-certs\") pod \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\" (UID: \"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4\") " Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.533088 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" (UID: "7ef5a391-9f07-4f4d-a2c1-2debb63f23a4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.536994 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-kube-api-access-lktp2" (OuterVolumeSpecName: "kube-api-access-lktp2") pod "7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" (UID: "7ef5a391-9f07-4f4d-a2c1-2debb63f23a4"). InnerVolumeSpecName "kube-api-access-lktp2". 
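The recurring `Cleaned up orphaned pod volumes dir` entries (for fb8ecb10 above, and for the horizon, cinder-api, and ceilometer UIDs earlier) are the kubelet's housekeeping pass over /var/lib/kubelet/pods: directories whose pod UID is no longer known, and whose volumes are fully unmounted, get removed. A rough illustration of the directory scan only; kubelet's real check is more careful, and `orphanedPodDirs` is my own name:

```go
// List pod directories under the kubelet root whose UID is not in the set
// of known pods; these are the candidates the cleanup pass logs about.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func orphanedPodDirs(root string, known map[string]bool) ([]string, error) {
	entries, err := os.ReadDir(root)
	if err != nil {
		return nil, err
	}
	var orphans []string
	for _, e := range entries {
		if e.IsDir() && !known[e.Name()] {
			orphans = append(orphans, filepath.Join(root, e.Name(), "volumes"))
		}
	}
	return orphans, nil
}

func main() {
	known := map[string]bool{"0e3d72e5-726a-4f4b-a677-6237021e8747": true}
	dirs, err := orphanedPodDirs("/var/lib/kubelet/pods", known)
	fmt.Println(dirs, err)
}
```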
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.615625 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lktp2\" (UniqueName: \"kubernetes.io/projected/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-kube-api-access-lktp2\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.615663 4691 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.649393 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-config" (OuterVolumeSpecName: "config") pod "7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" (UID: "7ef5a391-9f07-4f4d-a2c1-2debb63f23a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.651971 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" (UID: "7ef5a391-9f07-4f4d-a2c1-2debb63f23a4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.701165 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" (UID: "7ef5a391-9f07-4f4d-a2c1-2debb63f23a4"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.717598 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.717928 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.718061 4691 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:35 crc kubenswrapper[4691]: I1202 08:05:35.727854 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.130114 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07","Type":"ContainerStarted","Data":"4176c2c95fe6b8f3cb749208c816d7f9f593344da72d8656dbac7b8aa77f0e5b"} Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.131951 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.146510 4691 generic.go:334] "Generic (PLEG): container finished" podID="e29c75f3-7434-4bf1-a23e-818f8d998754" containerID="9eb9caf7e2c0b48ad0d9233078e0005064fa9a0585e725ec7e08433698f05250" exitCode=0 Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.146552 4691 generic.go:334] "Generic (PLEG): container finished" podID="e29c75f3-7434-4bf1-a23e-818f8d998754" containerID="da54da10259eaa22d6d8c15d99233d36b56b2ebfd4eaeb3b712b5e6854ccf7d9" exitCode=0 Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.146588 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e29c75f3-7434-4bf1-a23e-818f8d998754","Type":"ContainerDied","Data":"9eb9caf7e2c0b48ad0d9233078e0005064fa9a0585e725ec7e08433698f05250"} Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.146634 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e29c75f3-7434-4bf1-a23e-818f8d998754","Type":"ContainerDied","Data":"da54da10259eaa22d6d8c15d99233d36b56b2ebfd4eaeb3b712b5e6854ccf7d9"} Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.149541 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-54f65c9796-xhrrr" event={"ID":"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4","Type":"ContainerDied","Data":"93639c790695e3a5bab26dbed6b0cb96b131177716dde75edf91576dc4d213ec"} Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.149577 4691 scope.go:117] "RemoveContainer" containerID="2120bbc5ef060ade1923bcfcc752251e72b4c912cc46b65a5f836d511bff345d" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.149581 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-54f65c9796-xhrrr" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.167139 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.337476132 podStartE2EDuration="6.167113414s" podCreationTimestamp="2025-12-02 08:05:30 +0000 UTC" firstStartedPulling="2025-12-02 08:05:31.082375408 +0000 UTC m=+1178.866454270" lastFinishedPulling="2025-12-02 08:05:34.91201269 +0000 UTC m=+1182.696091552" observedRunningTime="2025-12-02 08:05:36.166507009 +0000 UTC m=+1183.950585901" watchObservedRunningTime="2025-12-02 08:05:36.167113414 +0000 UTC m=+1183.951192276" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.191949 4691 scope.go:117] "RemoveContainer" containerID="71151114337b8625dba3b9392398eed09d96a2c6af0ac7b621b5a223d7aac372" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.200347 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-54f65c9796-xhrrr"] Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.210045 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-54f65c9796-xhrrr"] Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.348989 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-99bc7c96-6nbmb" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.585633 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" path="/var/lib/kubelet/pods/7ef5a391-9f07-4f4d-a2c1-2debb63f23a4/volumes" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.623119 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.750384 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-scripts\") pod \"e29c75f3-7434-4bf1-a23e-818f8d998754\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.750598 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-combined-ca-bundle\") pod \"e29c75f3-7434-4bf1-a23e-818f8d998754\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.750713 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e29c75f3-7434-4bf1-a23e-818f8d998754-etc-machine-id\") pod \"e29c75f3-7434-4bf1-a23e-818f8d998754\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.750793 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-config-data-custom\") pod \"e29c75f3-7434-4bf1-a23e-818f8d998754\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.750821 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-config-data\") pod \"e29c75f3-7434-4bf1-a23e-818f8d998754\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " Dec 02 08:05:36 
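The `pod_startup_latency_tracker` record for ceilometer-0 above is internally consistent with two simple relations: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (08:05:36.167113414 - 08:05:30 = 6.167113414s), and podStartSLOduration additionally excludes the image-pull window lastFinishedPulling - firstStartedPulling (6.167113414s - 3.829637282s = 2.337476132s). The tracker's definition isn't spelled out in the log itself, but the arithmetic checks out exactly; the sketch below just reproduces it with the timestamps from the record (the `m=+...` monotonic suffixes stripped):

```go
// latency.go - reproduces the arithmetic behind the
// pod_startup_latency_tracker record for openstack/ceilometer-0 above.
package main

import (
	"fmt"
	"time"
)

// Layout matching Go's time.Time.String() output used in the record.
const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-12-02 08:05:30 +0000 UTC")
	firstPull := mustParse("2025-12-02 08:05:31.082375408 +0000 UTC")
	lastPull := mustParse("2025-12-02 08:05:34.91201269 +0000 UTC")
	running := mustParse("2025-12-02 08:05:36.167113414 +0000 UTC") // watchObservedRunningTime

	e2e := running.Sub(created)          // 6.167113414s, matches podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // 2.337476132s, matches podStartSLOduration
	fmt.Println("podStartE2EDuration:", e2e)
	fmt.Println("podStartSLOduration:", slo)
}
```

Consistent with this, pods that need no image pull report the zero time ("0001-01-01 00:00:00 +0000 UTC") for both pull timestamps and the two durations coincide, as the later cinder-scheduler-0 record (podStartSLOduration=3.30366818s, podStartE2EDuration=3.30366818s) shows.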
crc kubenswrapper[4691]: I1202 08:05:36.750839 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjmgj\" (UniqueName: \"kubernetes.io/projected/e29c75f3-7434-4bf1-a23e-818f8d998754-kube-api-access-hjmgj\") pod \"e29c75f3-7434-4bf1-a23e-818f8d998754\" (UID: \"e29c75f3-7434-4bf1-a23e-818f8d998754\") " Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.754832 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e29c75f3-7434-4bf1-a23e-818f8d998754-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e29c75f3-7434-4bf1-a23e-818f8d998754" (UID: "e29c75f3-7434-4bf1-a23e-818f8d998754"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.802122 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e29c75f3-7434-4bf1-a23e-818f8d998754" (UID: "e29c75f3-7434-4bf1-a23e-818f8d998754"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.802992 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29c75f3-7434-4bf1-a23e-818f8d998754-kube-api-access-hjmgj" (OuterVolumeSpecName: "kube-api-access-hjmgj") pod "e29c75f3-7434-4bf1-a23e-818f8d998754" (UID: "e29c75f3-7434-4bf1-a23e-818f8d998754"). InnerVolumeSpecName "kube-api-access-hjmgj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.803184 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-scripts" (OuterVolumeSpecName: "scripts") pod "e29c75f3-7434-4bf1-a23e-818f8d998754" (UID: "e29c75f3-7434-4bf1-a23e-818f8d998754"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.858947 4691 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e29c75f3-7434-4bf1-a23e-818f8d998754-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.858986 4691 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.858995 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjmgj\" (UniqueName: \"kubernetes.io/projected/e29c75f3-7434-4bf1-a23e-818f8d998754-kube-api-access-hjmgj\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.859007 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.921935 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e29c75f3-7434-4bf1-a23e-818f8d998754" (UID: "e29c75f3-7434-4bf1-a23e-818f8d998754"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.930869 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-config-data" (OuterVolumeSpecName: "config-data") pod "e29c75f3-7434-4bf1-a23e-818f8d998754" (UID: "e29c75f3-7434-4bf1-a23e-818f8d998754"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.961713 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:36 crc kubenswrapper[4691]: I1202 08:05:36.962083 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e29c75f3-7434-4bf1-a23e-818f8d998754-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.168353 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.169733 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e29c75f3-7434-4bf1-a23e-818f8d998754","Type":"ContainerDied","Data":"24d2215bbf01aaed09e3a03b70b6a30dc8889f5db91d9be7d0460d248bec2b1f"} Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.169877 4691 scope.go:117] "RemoveContainer" containerID="9eb9caf7e2c0b48ad0d9233078e0005064fa9a0585e725ec7e08433698f05250" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.201176 4691 scope.go:117] "RemoveContainer" containerID="da54da10259eaa22d6d8c15d99233d36b56b2ebfd4eaeb3b712b5e6854ccf7d9" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.217645 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.224791 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.264157 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:05:37 crc kubenswrapper[4691]: E1202 08:05:37.265168 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29c75f3-7434-4bf1-a23e-818f8d998754" containerName="cinder-scheduler" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.265288 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29c75f3-7434-4bf1-a23e-818f8d998754" containerName="cinder-scheduler" Dec 02 08:05:37 crc kubenswrapper[4691]: E1202 08:05:37.265358 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" containerName="neutron-httpd" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.265424 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" containerName="neutron-httpd" Dec 02 08:05:37 crc kubenswrapper[4691]: E1202 08:05:37.265481 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8ecb10-bbe6-4998-83cb-97441258d0c9" containerName="dnsmasq-dns" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.265534 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8ecb10-bbe6-4998-83cb-97441258d0c9" containerName="dnsmasq-dns" Dec 02 08:05:37 crc 
kubenswrapper[4691]: E1202 08:05:37.265613 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb8ecb10-bbe6-4998-83cb-97441258d0c9" containerName="init" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.265667 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8ecb10-bbe6-4998-83cb-97441258d0c9" containerName="init" Dec 02 08:05:37 crc kubenswrapper[4691]: E1202 08:05:37.265738 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29c75f3-7434-4bf1-a23e-818f8d998754" containerName="probe" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.265832 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29c75f3-7434-4bf1-a23e-818f8d998754" containerName="probe" Dec 02 08:05:37 crc kubenswrapper[4691]: E1202 08:05:37.266005 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" containerName="neutron-api" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.266087 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" containerName="neutron-api" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.266357 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29c75f3-7434-4bf1-a23e-818f8d998754" containerName="cinder-scheduler" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.266487 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" containerName="neutron-api" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.266556 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb8ecb10-bbe6-4998-83cb-97441258d0c9" containerName="dnsmasq-dns" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.266620 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ef5a391-9f07-4f4d-a2c1-2debb63f23a4" containerName="neutron-httpd" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.266683 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29c75f3-7434-4bf1-a23e-818f8d998754" containerName="probe" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.270743 4691 util.go:30] "No sandbox for pod can be found. 
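Once the old neutron, cinder-scheduler, and dnsmasq pods are gone, the CPU and memory managers drop the per-container resource assignments recorded for those pod UIDs ("RemoveStaleState: removing container" followed by "Deleted CPUSet assignment" / "RemoveStaleState removing state"). A simplified illustration of that pattern, not kubelet's actual implementation, is a map of assignments keyed by (podUID, container) that gets swept against the set of still-active pods:

```go
// stalestate.go - a simplified illustration (not kubelet's code) of the
// RemoveStaleState pattern in the records above: per-container resource
// assignments are dropped once their pod is no longer active.
package main

import "fmt"

type key struct{ podUID, container string }

type manager struct {
	assignments map[key]string // container -> assigned CPU set (illustrative)
}

// removeStaleState evicts every assignment whose pod is not in activePods,
// mirroring the "Deleted CPUSet assignment" lines in the log.
func (m *manager) removeStaleState(activePods map[string]bool) {
	for k := range m.assignments { // deleting while ranging is safe in Go
		if !activePods[k.podUID] {
			fmt.Printf("RemoveStaleState: removing container podUID=%q containerName=%q\n",
				k.podUID, k.container)
			delete(m.assignments, k)
		}
	}
}

func main() {
	m := &manager{assignments: map[key]string{
		{"e29c75f3-7434-4bf1-a23e-818f8d998754", "cinder-scheduler"}: "0-3", // CPU sets are made up
		{"7ef5a391-9f07-4f4d-a2c1-2debb63f23a4", "neutron-api"}:      "4-5",
	}}
	// Neither pod UID is active any more, so both entries are evicted.
	m.removeStaleState(map[string]bool{})
}
```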
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.285448 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.296310 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.373488 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-config-data\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.373616 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.373680 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.373703 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.373732 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btdr2\" (UniqueName: \"kubernetes.io/projected/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-kube-api-access-btdr2\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.373777 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-scripts\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.427528 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-99bc7c96-6nbmb" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.475672 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.475778 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-combined-ca-bundle\") pod \"cinder-scheduler-0\" 
(UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.475831 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btdr2\" (UniqueName: \"kubernetes.io/projected/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-kube-api-access-btdr2\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.475858 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-scripts\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.476021 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-config-data\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.476248 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.476407 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.482129 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-scripts\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.483389 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.487504 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.492085 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-config-data\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.504196 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btdr2\" (UniqueName: 
\"kubernetes.io/projected/cd7d9765-5aa9-4f8d-af36-53dfbba7da81-kube-api-access-btdr2\") pod \"cinder-scheduler-0\" (UID: \"cd7d9765-5aa9-4f8d-af36-53dfbba7da81\") " pod="openstack/cinder-scheduler-0" Dec 02 08:05:37 crc kubenswrapper[4691]: I1202 08:05:37.728853 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 02 08:05:38 crc kubenswrapper[4691]: I1202 08:05:38.383640 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:05:38 crc kubenswrapper[4691]: I1202 08:05:38.391094 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 02 08:05:38 crc kubenswrapper[4691]: I1202 08:05:38.581207 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e29c75f3-7434-4bf1-a23e-818f8d998754" path="/var/lib/kubelet/pods/e29c75f3-7434-4bf1-a23e-818f8d998754/volumes" Dec 02 08:05:39 crc kubenswrapper[4691]: I1202 08:05:39.166036 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:05:39 crc kubenswrapper[4691]: I1202 08:05:39.230082 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cd7d9765-5aa9-4f8d-af36-53dfbba7da81","Type":"ContainerStarted","Data":"f549e0b7891fea371c2045df714a29753d876ac376512ef7167b6d1060ed2a7b"} Dec 02 08:05:39 crc kubenswrapper[4691]: I1202 08:05:39.230126 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cd7d9765-5aa9-4f8d-af36-53dfbba7da81","Type":"ContainerStarted","Data":"54776887dcc55f73cfd951d41fc33dc8e43ebccf0e0c7a873b84161104d28b67"} Dec 02 08:05:40 crc kubenswrapper[4691]: I1202 08:05:40.255657 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"cd7d9765-5aa9-4f8d-af36-53dfbba7da81","Type":"ContainerStarted","Data":"7d832b54a5c699f8bb2454cf0fcbff8b6ca0ce93833c26c5ad50c18e974ce02a"} Dec 02 08:05:40 crc kubenswrapper[4691]: I1202 08:05:40.303695 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.30366818 podStartE2EDuration="3.30366818s" podCreationTimestamp="2025-12-02 08:05:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:40.285974725 +0000 UTC m=+1188.070053587" watchObservedRunningTime="2025-12-02 08:05:40.30366818 +0000 UTC m=+1188.087747042" Dec 02 08:05:41 crc kubenswrapper[4691]: I1202 08:05:41.524158 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-55c7cfcf8b-8rs5b" Dec 02 08:05:41 crc kubenswrapper[4691]: I1202 08:05:41.744671 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:41 crc kubenswrapper[4691]: I1202 08:05:41.764890 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6585c7db4b-jz894" Dec 02 08:05:41 crc kubenswrapper[4691]: I1202 08:05:41.858923 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-758b4cf594-fpkds"] Dec 02 08:05:41 crc kubenswrapper[4691]: I1202 08:05:41.859186 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-758b4cf594-fpkds" podUID="a95b5239-be71-4b06-88b2-52875915162e" 
containerName="horizon-log" containerID="cri-o://e8af3edd4341dbc165ca8c818abe8117ef90323a541e8dbd9510d558ed71bb9f" gracePeriod=30 Dec 02 08:05:41 crc kubenswrapper[4691]: I1202 08:05:41.859319 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-758b4cf594-fpkds" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon" containerID="cri-o://fe343d6838838b4d379b0b348314566df9f6e06b542365de413698b6e144a093" gracePeriod=30 Dec 02 08:05:41 crc kubenswrapper[4691]: I1202 08:05:41.869810 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-758b4cf594-fpkds" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Dec 02 08:05:41 crc kubenswrapper[4691]: I1202 08:05:41.873534 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-758b4cf594-fpkds" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": EOF" Dec 02 08:05:42 crc kubenswrapper[4691]: I1202 08:05:42.268974 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-86bc664f5b-6lklq" Dec 02 08:05:42 crc kubenswrapper[4691]: I1202 08:05:42.344680 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56b46d4bbb-w8vmj"] Dec 02 08:05:42 crc kubenswrapper[4691]: I1202 08:05:42.344967 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56b46d4bbb-w8vmj" podUID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" containerName="barbican-api-log" containerID="cri-o://c6495d7a8e820854aaed8fdebfe3bef2ad068886c18ffc5ab9fec81e65cdc7f9" gracePeriod=30 Dec 02 08:05:42 crc kubenswrapper[4691]: I1202 08:05:42.345410 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56b46d4bbb-w8vmj" podUID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" containerName="barbican-api" containerID="cri-o://8dd835aeec3a6b54f83f39a520ec4721c5cafac504197fd018ade1ecbfb91a73" gracePeriod=30 Dec 02 08:05:42 crc kubenswrapper[4691]: I1202 08:05:42.810564 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.286113 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.294360 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.303463 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-gr2hv" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.306125 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.317683 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.331370 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.361105 4691 generic.go:334] "Generic (PLEG): container finished" podID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" containerID="c6495d7a8e820854aaed8fdebfe3bef2ad068886c18ffc5ab9fec81e65cdc7f9" exitCode=143 Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.361164 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b46d4bbb-w8vmj" event={"ID":"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef","Type":"ContainerDied","Data":"c6495d7a8e820854aaed8fdebfe3bef2ad068886c18ffc5ab9fec81e65cdc7f9"} Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.423554 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7-openstack-config\") pod \"openstackclient\" (UID: \"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7\") " pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.423644 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7\") " pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.423827 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq7p7\" (UniqueName: \"kubernetes.io/projected/8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7-kube-api-access-nq7p7\") pod \"openstackclient\" (UID: \"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7\") " pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.423918 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7-openstack-config-secret\") pod \"openstackclient\" (UID: \"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7\") " pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.524520 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7-openstack-config-secret\") pod \"openstackclient\" (UID: \"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7\") " pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.524635 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7-openstack-config\") pod \"openstackclient\" (UID: 
\"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7\") " pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.524669 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7\") " pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.524778 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq7p7\" (UniqueName: \"kubernetes.io/projected/8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7-kube-api-access-nq7p7\") pod \"openstackclient\" (UID: \"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7\") " pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.526546 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7-openstack-config\") pod \"openstackclient\" (UID: \"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7\") " pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.544641 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7-openstack-config-secret\") pod \"openstackclient\" (UID: \"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7\") " pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.553572 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7\") " pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.555749 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq7p7\" (UniqueName: \"kubernetes.io/projected/8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7-kube-api-access-nq7p7\") pod \"openstackclient\" (UID: \"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7\") " pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.633134 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 02 08:05:43 crc kubenswrapper[4691]: I1202 08:05:43.982460 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 02 08:05:44 crc kubenswrapper[4691]: I1202 08:05:44.348551 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 02 08:05:44 crc kubenswrapper[4691]: I1202 08:05:44.418000 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7","Type":"ContainerStarted","Data":"3e6b621a42d57243f48e78fe0a00582c1796a7240f2e28739c5c6ace9dcb453e"} Dec 02 08:05:45 crc kubenswrapper[4691]: I1202 08:05:45.906436 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56b46d4bbb-w8vmj" podUID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:46690->10.217.0.162:9311: read: connection reset by peer" Dec 02 08:05:45 crc kubenswrapper[4691]: I1202 08:05:45.906605 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56b46d4bbb-w8vmj" podUID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.162:9311/healthcheck\": read tcp 10.217.0.2:46706->10.217.0.162:9311: read: connection reset by peer" Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.475101 4691 generic.go:334] "Generic (PLEG): container finished" podID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" containerID="8dd835aeec3a6b54f83f39a520ec4721c5cafac504197fd018ade1ecbfb91a73" exitCode=0 Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.475645 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b46d4bbb-w8vmj" event={"ID":"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef","Type":"ContainerDied","Data":"8dd835aeec3a6b54f83f39a520ec4721c5cafac504197fd018ade1ecbfb91a73"} Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.475695 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56b46d4bbb-w8vmj" event={"ID":"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef","Type":"ContainerDied","Data":"66fd4864b68e7c0bf7e09c2b400e350fcde7956ffa7c102049d4dab1a9720d3b"} Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.475717 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66fd4864b68e7c0bf7e09c2b400e350fcde7956ffa7c102049d4dab1a9720d3b" Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.512935 4691 util.go:48] "No ready sandbox for pod can be found. 
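The prober records above show HTTP readiness checks failing with "connection reset by peer" while barbican-api shuts down: the kubelet keeps issuing GETs against the healthcheck endpoint, and the dying server resets the connection mid-response. A simplified analogue of such an HTTP probe (not kubelet's prober; kubelet treats 2xx/3xx as success, so anything at or above 400, or any transport error, fails):

```go
// probe.go - a simplified analogue of the failing HTTP readiness
// checks above: GET with a short timeout, fail on transport errors
// or status >= 400.
package main

import (
	"fmt"
	"net/http"
	"time"
)

func probe(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "read: connection reset by peer" during shutdown
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 400 {
		return fmt.Errorf("unexpected status %d", resp.StatusCode)
	}
	return nil
}

func main() {
	// The barbican healthcheck URL from the failing records above.
	if err := probe("http://10.217.0.162:9311/healthcheck"); err != nil {
		fmt.Println("Probe failed:", err)
	} else {
		fmt.Println("ready")
	}
}
```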
Need to start a new one" pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.614609 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-config-data-custom\") pod \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.614910 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-combined-ca-bundle\") pod \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.615098 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-logs\") pod \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.615144 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xn6v\" (UniqueName: \"kubernetes.io/projected/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-kube-api-access-9xn6v\") pod \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.615313 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-config-data\") pod \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\" (UID: \"65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef\") " Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.622850 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-logs" (OuterVolumeSpecName: "logs") pod "65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" (UID: "65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.657375 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" (UID: "65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.665327 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-kube-api-access-9xn6v" (OuterVolumeSpecName: "kube-api-access-9xn6v") pod "65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" (UID: "65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef"). InnerVolumeSpecName "kube-api-access-9xn6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.712529 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" (UID: "65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.718123 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.718152 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.718164 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xn6v\" (UniqueName: \"kubernetes.io/projected/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-kube-api-access-9xn6v\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.718175 4691 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.809360 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-config-data" (OuterVolumeSpecName: "config-data") pod "65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" (UID: "65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:46 crc kubenswrapper[4691]: I1202 08:05:46.820101 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:47 crc kubenswrapper[4691]: I1202 08:05:47.336350 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-758b4cf594-fpkds" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": read tcp 10.217.0.2:53096->10.217.0.146:8443: read: connection reset by peer" Dec 02 08:05:47 crc kubenswrapper[4691]: I1202 08:05:47.499087 4691 generic.go:334] "Generic (PLEG): container finished" podID="a95b5239-be71-4b06-88b2-52875915162e" containerID="fe343d6838838b4d379b0b348314566df9f6e06b542365de413698b6e144a093" exitCode=0 Dec 02 08:05:47 crc kubenswrapper[4691]: I1202 08:05:47.499195 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-56b46d4bbb-w8vmj" Dec 02 08:05:47 crc kubenswrapper[4691]: I1202 08:05:47.499539 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758b4cf594-fpkds" event={"ID":"a95b5239-be71-4b06-88b2-52875915162e","Type":"ContainerDied","Data":"fe343d6838838b4d379b0b348314566df9f6e06b542365de413698b6e144a093"} Dec 02 08:05:47 crc kubenswrapper[4691]: I1202 08:05:47.545889 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56b46d4bbb-w8vmj"] Dec 02 08:05:47 crc kubenswrapper[4691]: I1202 08:05:47.564683 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-56b46d4bbb-w8vmj"] Dec 02 08:05:48 crc kubenswrapper[4691]: I1202 08:05:48.140988 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 02 08:05:48 crc kubenswrapper[4691]: I1202 08:05:48.577281 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" path="/var/lib/kubelet/pods/65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef/volumes" Dec 02 08:05:51 crc kubenswrapper[4691]: I1202 08:05:51.898396 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-76d578d5f5-hmcbw"] Dec 02 08:05:51 crc kubenswrapper[4691]: I1202 08:05:51.898776 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:05:51 crc kubenswrapper[4691]: I1202 08:05:51.900394 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:05:51 crc kubenswrapper[4691]: E1202 08:05:51.901588 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" containerName="barbican-api-log" Dec 02 08:05:51 crc kubenswrapper[4691]: I1202 08:05:51.901661 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" containerName="barbican-api-log" Dec 02 08:05:51 crc kubenswrapper[4691]: E1202 08:05:51.901745 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" containerName="barbican-api" Dec 02 08:05:51 crc kubenswrapper[4691]: I1202 08:05:51.901787 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" containerName="barbican-api" Dec 02 08:05:51 crc kubenswrapper[4691]: I1202 08:05:51.902277 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" containerName="barbican-api-log" Dec 02 08:05:51 crc kubenswrapper[4691]: I1202 08:05:51.902336 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e50aa7-24b4-4c0e-b4f2-8fcca54aa8ef" containerName="barbican-api" Dec 02 08:05:51 crc kubenswrapper[4691]: I1202 08:05:51.905290 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:51 crc kubenswrapper[4691]: I1202 08:05:51.915604 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 02 08:05:51 crc kubenswrapper[4691]: I1202 08:05:51.915795 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 02 08:05:51 crc kubenswrapper[4691]: I1202 08:05:51.919247 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 02 08:05:51 crc kubenswrapper[4691]: I1202 08:05:51.930180 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76d578d5f5-hmcbw"] Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.094079 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4bx8\" (UniqueName: \"kubernetes.io/projected/de7d695d-6d9a-4de2-830e-579f9d496f08-kube-api-access-h4bx8\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.094163 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7d695d-6d9a-4de2-830e-579f9d496f08-internal-tls-certs\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.094281 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7d695d-6d9a-4de2-830e-579f9d496f08-combined-ca-bundle\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.094405 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7d695d-6d9a-4de2-830e-579f9d496f08-public-tls-certs\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.094443 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7d695d-6d9a-4de2-830e-579f9d496f08-config-data\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.094474 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/de7d695d-6d9a-4de2-830e-579f9d496f08-etc-swift\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.094639 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7d695d-6d9a-4de2-830e-579f9d496f08-log-httpd\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " 
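The reflector.go records ("Caches populated for *v1.Secret from object-openstack/...") mark the kubelet's informer caches finishing their initial list+watch for the Secrets the new swift-proxy pod references (its TLS certs and config-data). A generic client-go sketch of the same mechanism, populating a namespaced Secret informer cache and waiting for sync; this is standard informer usage, not kubelet's internal wiring:

```go
// secretcache.go - a generic client-go sketch of what the reflector
// records above represent: an informer listing+watching Secrets in the
// "openstack" namespace until its local cache is populated.
package main

import (
	"context"
	"fmt"
	"time"

	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 10*time.Minute, informers.WithNamespace("openstack"))
	secrets := factory.Core().V1().Secrets().Informer()

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()
	factory.Start(ctx.Done())
	if !cache.WaitForCacheSync(ctx.Done(), secrets.HasSynced) {
		panic("cache never synced")
	}
	fmt.Println("Caches populated for *v1.Secret in namespace openstack")
}
```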
pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.094811 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7d695d-6d9a-4de2-830e-579f9d496f08-run-httpd\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.196982 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7d695d-6d9a-4de2-830e-579f9d496f08-combined-ca-bundle\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.197061 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7d695d-6d9a-4de2-830e-579f9d496f08-public-tls-certs\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.197108 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7d695d-6d9a-4de2-830e-579f9d496f08-config-data\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.197144 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/de7d695d-6d9a-4de2-830e-579f9d496f08-etc-swift\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.197176 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7d695d-6d9a-4de2-830e-579f9d496f08-log-httpd\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.197213 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7d695d-6d9a-4de2-830e-579f9d496f08-run-httpd\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.197265 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4bx8\" (UniqueName: \"kubernetes.io/projected/de7d695d-6d9a-4de2-830e-579f9d496f08-kube-api-access-h4bx8\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.197313 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7d695d-6d9a-4de2-830e-579f9d496f08-internal-tls-certs\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 
08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.198277 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7d695d-6d9a-4de2-830e-579f9d496f08-run-httpd\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.198475 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7d695d-6d9a-4de2-830e-579f9d496f08-log-httpd\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.203844 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7d695d-6d9a-4de2-830e-579f9d496f08-config-data\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.203903 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7d695d-6d9a-4de2-830e-579f9d496f08-internal-tls-certs\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.213634 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7d695d-6d9a-4de2-830e-579f9d496f08-combined-ca-bundle\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.213912 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/de7d695d-6d9a-4de2-830e-579f9d496f08-etc-swift\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.214566 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/de7d695d-6d9a-4de2-830e-579f9d496f08-public-tls-certs\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.217748 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4bx8\" (UniqueName: \"kubernetes.io/projected/de7d695d-6d9a-4de2-830e-579f9d496f08-kube-api-access-h4bx8\") pod \"swift-proxy-76d578d5f5-hmcbw\" (UID: \"de7d695d-6d9a-4de2-830e-579f9d496f08\") " pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.249373 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.905523 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.905902 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="ceilometer-central-agent" containerID="cri-o://f2f72461cefac4009499f370f2aee2dba073b86341673e63b08e3bdde9a08fea" gracePeriod=30 Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.906059 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="proxy-httpd" containerID="cri-o://4176c2c95fe6b8f3cb749208c816d7f9f593344da72d8656dbac7b8aa77f0e5b" gracePeriod=30 Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.906114 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="sg-core" containerID="cri-o://bd4fa8d4f08b3370eb4b2ee453c12865833b8f49bbc5e312a060a4964d03ca3f" gracePeriod=30 Dec 02 08:05:52 crc kubenswrapper[4691]: I1202 08:05:52.906199 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="ceilometer-notification-agent" containerID="cri-o://fd9fd6183d7b0ef9a454e4d763e05be12eb9a2def7ebf5604b2a4b828d787944" gracePeriod=30 Dec 02 08:05:53 crc kubenswrapper[4691]: I1202 08:05:53.073924 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.165:3000/\": read tcp 10.217.0.2:58012->10.217.0.165:3000: read: connection reset by peer" Dec 02 08:05:53 crc kubenswrapper[4691]: I1202 08:05:53.628290 4691 generic.go:334] "Generic (PLEG): container finished" podID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerID="4176c2c95fe6b8f3cb749208c816d7f9f593344da72d8656dbac7b8aa77f0e5b" exitCode=0 Dec 02 08:05:53 crc kubenswrapper[4691]: I1202 08:05:53.628325 4691 generic.go:334] "Generic (PLEG): container finished" podID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerID="bd4fa8d4f08b3370eb4b2ee453c12865833b8f49bbc5e312a060a4964d03ca3f" exitCode=2 Dec 02 08:05:53 crc kubenswrapper[4691]: I1202 08:05:53.628334 4691 generic.go:334] "Generic (PLEG): container finished" podID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerID="f2f72461cefac4009499f370f2aee2dba073b86341673e63b08e3bdde9a08fea" exitCode=0 Dec 02 08:05:53 crc kubenswrapper[4691]: I1202 08:05:53.628354 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07","Type":"ContainerDied","Data":"4176c2c95fe6b8f3cb749208c816d7f9f593344da72d8656dbac7b8aa77f0e5b"} Dec 02 08:05:53 crc kubenswrapper[4691]: I1202 08:05:53.628381 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07","Type":"ContainerDied","Data":"bd4fa8d4f08b3370eb4b2ee453c12865833b8f49bbc5e312a060a4964d03ca3f"} Dec 02 08:05:53 crc kubenswrapper[4691]: I1202 08:05:53.628393 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07","Type":"ContainerDied","Data":"f2f72461cefac4009499f370f2aee2dba073b86341673e63b08e3bdde9a08fea"} Dec 02 08:05:54 crc kubenswrapper[4691]: I1202 08:05:54.414595 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-758b4cf594-fpkds" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 02 08:05:54 crc kubenswrapper[4691]: I1202 08:05:54.661964 4691 generic.go:334] "Generic (PLEG): container finished" podID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerID="fd9fd6183d7b0ef9a454e4d763e05be12eb9a2def7ebf5604b2a4b828d787944" exitCode=0 Dec 02 08:05:54 crc kubenswrapper[4691]: I1202 08:05:54.662112 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07","Type":"ContainerDied","Data":"fd9fd6183d7b0ef9a454e4d763e05be12eb9a2def7ebf5604b2a4b828d787944"} Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.696989 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7","Type":"ContainerStarted","Data":"874eb7f92456da48b9120c0f39b79ed8a9b6428a4dd0b4a5e35b892c34ef2f81"} Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.724714 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.709722953 podStartE2EDuration="14.724683625s" podCreationTimestamp="2025-12-02 08:05:43 +0000 UTC" firstStartedPulling="2025-12-02 08:05:44.388299951 +0000 UTC m=+1192.172378813" lastFinishedPulling="2025-12-02 08:05:57.403260623 +0000 UTC m=+1205.187339485" observedRunningTime="2025-12-02 08:05:57.71714347 +0000 UTC m=+1205.501222332" watchObservedRunningTime="2025-12-02 08:05:57.724683625 +0000 UTC m=+1205.508762487" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.751657 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.860277 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-config-data\") pod \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.860346 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-log-httpd\") pod \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.860410 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-combined-ca-bundle\") pod \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.860510 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-run-httpd\") pod \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.860559 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-scripts\") pod \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.860663 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drg65\" (UniqueName: \"kubernetes.io/projected/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-kube-api-access-drg65\") pod \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.860714 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-sg-core-conf-yaml\") pod \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\" (UID: \"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07\") " Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.861238 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" (UID: "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.861713 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" (UID: "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.867683 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-kube-api-access-drg65" (OuterVolumeSpecName: "kube-api-access-drg65") pod "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" (UID: "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07"). InnerVolumeSpecName "kube-api-access-drg65". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.868168 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-scripts" (OuterVolumeSpecName: "scripts") pod "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" (UID: "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.894055 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" (UID: "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.941340 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" (UID: "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.963309 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.963351 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.963363 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.963374 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.963384 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.963394 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drg65\" (UniqueName: \"kubernetes.io/projected/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-kube-api-access-drg65\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:57 crc kubenswrapper[4691]: I1202 08:05:57.968895 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-config-data" (OuterVolumeSpecName: "config-data") pod "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" (UID: "cba2ebbb-0458-4a48-ae0f-dc8a99f33c07"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.070733 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.119671 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-76d578d5f5-hmcbw"] Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.711515 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cba2ebbb-0458-4a48-ae0f-dc8a99f33c07","Type":"ContainerDied","Data":"453cc329040315ecb996b7f4e79ab36dc1695d9ff9cf64bb579ab9d35c8b6eb3"} Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.712405 4691 scope.go:117] "RemoveContainer" containerID="4176c2c95fe6b8f3cb749208c816d7f9f593344da72d8656dbac7b8aa77f0e5b" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.712738 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.719622 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76d578d5f5-hmcbw" event={"ID":"de7d695d-6d9a-4de2-830e-579f9d496f08","Type":"ContainerStarted","Data":"39c839057083516200d54fca43bd0fe7f00e945907a6b5c1e3e375015ca4db51"} Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.719707 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76d578d5f5-hmcbw" event={"ID":"de7d695d-6d9a-4de2-830e-579f9d496f08","Type":"ContainerStarted","Data":"df094a2145ca20a582b520c605d528ca592f252768b05b70ad3283c722edb5cf"} Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.743378 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.744867 4691 scope.go:117] "RemoveContainer" containerID="bd4fa8d4f08b3370eb4b2ee453c12865833b8f49bbc5e312a060a4964d03ca3f" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.751894 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.772177 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:05:58 crc kubenswrapper[4691]: E1202 08:05:58.773079 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="sg-core" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.773104 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="sg-core" Dec 02 08:05:58 crc kubenswrapper[4691]: E1202 08:05:58.773159 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="ceilometer-notification-agent" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.773166 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="ceilometer-notification-agent" Dec 02 08:05:58 crc kubenswrapper[4691]: E1202 08:05:58.773176 4691 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="ceilometer-central-agent" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.773183 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="ceilometer-central-agent" Dec 02 08:05:58 crc kubenswrapper[4691]: E1202 08:05:58.773197 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="proxy-httpd" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.773202 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="proxy-httpd" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.773385 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="ceilometer-notification-agent" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.773407 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="proxy-httpd" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.773425 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="ceilometer-central-agent" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.773434 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" containerName="sg-core" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.776644 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.780540 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.780990 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.781493 4691 scope.go:117] "RemoveContainer" containerID="fd9fd6183d7b0ef9a454e4d763e05be12eb9a2def7ebf5604b2a4b828d787944" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.795676 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.833670 4691 scope.go:117] "RemoveContainer" containerID="f2f72461cefac4009499f370f2aee2dba073b86341673e63b08e3bdde9a08fea" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.886784 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-scripts\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.886962 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44fl2\" (UniqueName: \"kubernetes.io/projected/7667b055-d667-4488-a733-8f8996295fcf-kube-api-access-44fl2\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.887055 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-combined-ca-bundle\") pod \"ceilometer-0\" 
(UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.887156 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7667b055-d667-4488-a733-8f8996295fcf-run-httpd\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.887300 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.887382 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7667b055-d667-4488-a733-8f8996295fcf-log-httpd\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.887460 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-config-data\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.989374 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.989450 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7667b055-d667-4488-a733-8f8996295fcf-run-httpd\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.989526 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.989563 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7667b055-d667-4488-a733-8f8996295fcf-log-httpd\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.989610 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-config-data\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.989650 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-scripts\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.989711 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44fl2\" (UniqueName: \"kubernetes.io/projected/7667b055-d667-4488-a733-8f8996295fcf-kube-api-access-44fl2\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.990623 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7667b055-d667-4488-a733-8f8996295fcf-run-httpd\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.990752 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7667b055-d667-4488-a733-8f8996295fcf-log-httpd\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.995388 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.997334 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-scripts\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:58 crc kubenswrapper[4691]: I1202 08:05:58.998565 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:59 crc kubenswrapper[4691]: I1202 08:05:59.001029 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-config-data\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:59 crc kubenswrapper[4691]: I1202 08:05:59.018951 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44fl2\" (UniqueName: \"kubernetes.io/projected/7667b055-d667-4488-a733-8f8996295fcf-kube-api-access-44fl2\") pod \"ceilometer-0\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " pod="openstack/ceilometer-0" Dec 02 08:05:59 crc kubenswrapper[4691]: I1202 08:05:59.096810 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:05:59 crc kubenswrapper[4691]: I1202 08:05:59.630576 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:05:59 crc kubenswrapper[4691]: I1202 08:05:59.731120 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7667b055-d667-4488-a733-8f8996295fcf","Type":"ContainerStarted","Data":"ee207e61d08b823576778b13136f580f64ab79c725b424fa5ebfffd7df024d36"} Dec 02 08:05:59 crc kubenswrapper[4691]: I1202 08:05:59.750349 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-76d578d5f5-hmcbw" event={"ID":"de7d695d-6d9a-4de2-830e-579f9d496f08","Type":"ContainerStarted","Data":"194459ad693fb4344345146dd154819bc486f3400b91481eaa9768a097fa5102"} Dec 02 08:05:59 crc kubenswrapper[4691]: I1202 08:05:59.751093 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:59 crc kubenswrapper[4691]: I1202 08:05:59.751127 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:05:59 crc kubenswrapper[4691]: I1202 08:05:59.781409 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-76d578d5f5-hmcbw" podStartSLOduration=8.781382373 podStartE2EDuration="8.781382373s" podCreationTimestamp="2025-12-02 08:05:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:05:59.77311952 +0000 UTC m=+1207.557198382" watchObservedRunningTime="2025-12-02 08:05:59.781382373 +0000 UTC m=+1207.565461235" Dec 02 08:05:59 crc kubenswrapper[4691]: I1202 08:05:59.887974 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:06:00 crc kubenswrapper[4691]: I1202 08:06:00.574026 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba2ebbb-0458-4a48-ae0f-dc8a99f33c07" path="/var/lib/kubelet/pods/cba2ebbb-0458-4a48-ae0f-dc8a99f33c07/volumes" Dec 02 08:06:00 crc kubenswrapper[4691]: I1202 08:06:00.981420 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-4p5d6"] Dec 02 08:06:00 crc kubenswrapper[4691]: I1202 08:06:00.987056 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-4p5d6" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.018544 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4p5d6"] Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.135418 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/911d250b-4d53-4d08-aee0-a92de349d7f1-operator-scripts\") pod \"nova-api-db-create-4p5d6\" (UID: \"911d250b-4d53-4d08-aee0-a92de349d7f1\") " pod="openstack/nova-api-db-create-4p5d6" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.135927 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vttfg\" (UniqueName: \"kubernetes.io/projected/911d250b-4d53-4d08-aee0-a92de349d7f1-kube-api-access-vttfg\") pod \"nova-api-db-create-4p5d6\" (UID: \"911d250b-4d53-4d08-aee0-a92de349d7f1\") " pod="openstack/nova-api-db-create-4p5d6" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.164396 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-zdmsm"] Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.176037 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zdmsm" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.215845 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0427-account-create-update-kprdn"] Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.218701 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0427-account-create-update-kprdn" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.221402 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.242480 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1-operator-scripts\") pod \"nova-cell0-db-create-zdmsm\" (UID: \"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1\") " pod="openstack/nova-cell0-db-create-zdmsm" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.242908 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cb2g\" (UniqueName: \"kubernetes.io/projected/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1-kube-api-access-8cb2g\") pod \"nova-cell0-db-create-zdmsm\" (UID: \"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1\") " pod="openstack/nova-cell0-db-create-zdmsm" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.243076 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vttfg\" (UniqueName: \"kubernetes.io/projected/911d250b-4d53-4d08-aee0-a92de349d7f1-kube-api-access-vttfg\") pod \"nova-api-db-create-4p5d6\" (UID: \"911d250b-4d53-4d08-aee0-a92de349d7f1\") " pod="openstack/nova-api-db-create-4p5d6" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.243456 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/911d250b-4d53-4d08-aee0-a92de349d7f1-operator-scripts\") pod \"nova-api-db-create-4p5d6\" (UID: \"911d250b-4d53-4d08-aee0-a92de349d7f1\") " pod="openstack/nova-api-db-create-4p5d6" Dec 02 08:06:01 crc 
kubenswrapper[4691]: I1202 08:06:01.244750 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/911d250b-4d53-4d08-aee0-a92de349d7f1-operator-scripts\") pod \"nova-api-db-create-4p5d6\" (UID: \"911d250b-4d53-4d08-aee0-a92de349d7f1\") " pod="openstack/nova-api-db-create-4p5d6" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.270212 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zdmsm"] Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.297291 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vttfg\" (UniqueName: \"kubernetes.io/projected/911d250b-4d53-4d08-aee0-a92de349d7f1-kube-api-access-vttfg\") pod \"nova-api-db-create-4p5d6\" (UID: \"911d250b-4d53-4d08-aee0-a92de349d7f1\") " pod="openstack/nova-api-db-create-4p5d6" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.302334 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0427-account-create-update-kprdn"] Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.333419 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4p5d6" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.352682 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-jtr9g"] Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.354507 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jtr9g" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.366847 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jtr9g"] Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.368217 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f44801-39b7-4ed2-b8b6-3b15e740e058-operator-scripts\") pod \"nova-api-0427-account-create-update-kprdn\" (UID: \"37f44801-39b7-4ed2-b8b6-3b15e740e058\") " pod="openstack/nova-api-0427-account-create-update-kprdn" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.368279 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2q9r\" (UniqueName: \"kubernetes.io/projected/37f44801-39b7-4ed2-b8b6-3b15e740e058-kube-api-access-r2q9r\") pod \"nova-api-0427-account-create-update-kprdn\" (UID: \"37f44801-39b7-4ed2-b8b6-3b15e740e058\") " pod="openstack/nova-api-0427-account-create-update-kprdn" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.368319 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1-operator-scripts\") pod \"nova-cell0-db-create-zdmsm\" (UID: \"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1\") " pod="openstack/nova-cell0-db-create-zdmsm" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.368357 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cb2g\" (UniqueName: \"kubernetes.io/projected/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1-kube-api-access-8cb2g\") pod \"nova-cell0-db-create-zdmsm\" (UID: \"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1\") " pod="openstack/nova-cell0-db-create-zdmsm" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.369601 4691 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1-operator-scripts\") pod \"nova-cell0-db-create-zdmsm\" (UID: \"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1\") " pod="openstack/nova-cell0-db-create-zdmsm" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.424230 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cb2g\" (UniqueName: \"kubernetes.io/projected/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1-kube-api-access-8cb2g\") pod \"nova-cell0-db-create-zdmsm\" (UID: \"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1\") " pod="openstack/nova-cell0-db-create-zdmsm" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.446561 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-6ba3-account-create-update-qdwjw"] Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.448779 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.451994 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.461920 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6ba3-account-create-update-qdwjw"] Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.473062 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f4jw\" (UniqueName: \"kubernetes.io/projected/0df042e0-5c89-43c2-aa13-6e894851bc4b-kube-api-access-4f4jw\") pod \"nova-cell1-db-create-jtr9g\" (UID: \"0df042e0-5c89-43c2-aa13-6e894851bc4b\") " pod="openstack/nova-cell1-db-create-jtr9g" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.473391 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f44801-39b7-4ed2-b8b6-3b15e740e058-operator-scripts\") pod \"nova-api-0427-account-create-update-kprdn\" (UID: \"37f44801-39b7-4ed2-b8b6-3b15e740e058\") " pod="openstack/nova-api-0427-account-create-update-kprdn" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.473499 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0df042e0-5c89-43c2-aa13-6e894851bc4b-operator-scripts\") pod \"nova-cell1-db-create-jtr9g\" (UID: \"0df042e0-5c89-43c2-aa13-6e894851bc4b\") " pod="openstack/nova-cell1-db-create-jtr9g" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.473634 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2q9r\" (UniqueName: \"kubernetes.io/projected/37f44801-39b7-4ed2-b8b6-3b15e740e058-kube-api-access-r2q9r\") pod \"nova-api-0427-account-create-update-kprdn\" (UID: \"37f44801-39b7-4ed2-b8b6-3b15e740e058\") " pod="openstack/nova-api-0427-account-create-update-kprdn" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.474252 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f44801-39b7-4ed2-b8b6-3b15e740e058-operator-scripts\") pod \"nova-api-0427-account-create-update-kprdn\" (UID: \"37f44801-39b7-4ed2-b8b6-3b15e740e058\") " pod="openstack/nova-api-0427-account-create-update-kprdn" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.510343 4691 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2q9r\" (UniqueName: \"kubernetes.io/projected/37f44801-39b7-4ed2-b8b6-3b15e740e058-kube-api-access-r2q9r\") pod \"nova-api-0427-account-create-update-kprdn\" (UID: \"37f44801-39b7-4ed2-b8b6-3b15e740e058\") " pod="openstack/nova-api-0427-account-create-update-kprdn" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.545456 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zdmsm" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.579307 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0df042e0-5c89-43c2-aa13-6e894851bc4b-operator-scripts\") pod \"nova-cell1-db-create-jtr9g\" (UID: \"0df042e0-5c89-43c2-aa13-6e894851bc4b\") " pod="openstack/nova-cell1-db-create-jtr9g" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.579463 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f4jw\" (UniqueName: \"kubernetes.io/projected/0df042e0-5c89-43c2-aa13-6e894851bc4b-kube-api-access-4f4jw\") pod \"nova-cell1-db-create-jtr9g\" (UID: \"0df042e0-5c89-43c2-aa13-6e894851bc4b\") " pod="openstack/nova-cell1-db-create-jtr9g" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.579491 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrkt4\" (UniqueName: \"kubernetes.io/projected/1aa3343a-ce66-4872-9b9e-d011b842e4d1-kube-api-access-xrkt4\") pod \"nova-cell0-6ba3-account-create-update-qdwjw\" (UID: \"1aa3343a-ce66-4872-9b9e-d011b842e4d1\") " pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.579538 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa3343a-ce66-4872-9b9e-d011b842e4d1-operator-scripts\") pod \"nova-cell0-6ba3-account-create-update-qdwjw\" (UID: \"1aa3343a-ce66-4872-9b9e-d011b842e4d1\") " pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.583988 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0df042e0-5c89-43c2-aa13-6e894851bc4b-operator-scripts\") pod \"nova-cell1-db-create-jtr9g\" (UID: \"0df042e0-5c89-43c2-aa13-6e894851bc4b\") " pod="openstack/nova-cell1-db-create-jtr9g" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.592074 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-2741-account-create-update-v7c8q"] Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.593923 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0427-account-create-update-kprdn" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.596680 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2741-account-create-update-v7c8q" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.600577 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.607695 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f4jw\" (UniqueName: \"kubernetes.io/projected/0df042e0-5c89-43c2-aa13-6e894851bc4b-kube-api-access-4f4jw\") pod \"nova-cell1-db-create-jtr9g\" (UID: \"0df042e0-5c89-43c2-aa13-6e894851bc4b\") " pod="openstack/nova-cell1-db-create-jtr9g" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.620059 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2741-account-create-update-v7c8q"] Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.690915 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrkt4\" (UniqueName: \"kubernetes.io/projected/1aa3343a-ce66-4872-9b9e-d011b842e4d1-kube-api-access-xrkt4\") pod \"nova-cell0-6ba3-account-create-update-qdwjw\" (UID: \"1aa3343a-ce66-4872-9b9e-d011b842e4d1\") " pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.691029 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhc55\" (UniqueName: \"kubernetes.io/projected/eeb15868-5efc-4bc1-a297-9c4517cd23ee-kube-api-access-xhc55\") pod \"nova-cell1-2741-account-create-update-v7c8q\" (UID: \"eeb15868-5efc-4bc1-a297-9c4517cd23ee\") " pod="openstack/nova-cell1-2741-account-create-update-v7c8q" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.691143 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa3343a-ce66-4872-9b9e-d011b842e4d1-operator-scripts\") pod \"nova-cell0-6ba3-account-create-update-qdwjw\" (UID: \"1aa3343a-ce66-4872-9b9e-d011b842e4d1\") " pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.691681 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb15868-5efc-4bc1-a297-9c4517cd23ee-operator-scripts\") pod \"nova-cell1-2741-account-create-update-v7c8q\" (UID: \"eeb15868-5efc-4bc1-a297-9c4517cd23ee\") " pod="openstack/nova-cell1-2741-account-create-update-v7c8q" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.692998 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa3343a-ce66-4872-9b9e-d011b842e4d1-operator-scripts\") pod \"nova-cell0-6ba3-account-create-update-qdwjw\" (UID: \"1aa3343a-ce66-4872-9b9e-d011b842e4d1\") " pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.723209 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrkt4\" (UniqueName: \"kubernetes.io/projected/1aa3343a-ce66-4872-9b9e-d011b842e4d1-kube-api-access-xrkt4\") pod \"nova-cell0-6ba3-account-create-update-qdwjw\" (UID: \"1aa3343a-ce66-4872-9b9e-d011b842e4d1\") " pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.752788 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jtr9g" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.788307 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.793346 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhc55\" (UniqueName: \"kubernetes.io/projected/eeb15868-5efc-4bc1-a297-9c4517cd23ee-kube-api-access-xhc55\") pod \"nova-cell1-2741-account-create-update-v7c8q\" (UID: \"eeb15868-5efc-4bc1-a297-9c4517cd23ee\") " pod="openstack/nova-cell1-2741-account-create-update-v7c8q" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.793496 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb15868-5efc-4bc1-a297-9c4517cd23ee-operator-scripts\") pod \"nova-cell1-2741-account-create-update-v7c8q\" (UID: \"eeb15868-5efc-4bc1-a297-9c4517cd23ee\") " pod="openstack/nova-cell1-2741-account-create-update-v7c8q" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.794461 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb15868-5efc-4bc1-a297-9c4517cd23ee-operator-scripts\") pod \"nova-cell1-2741-account-create-update-v7c8q\" (UID: \"eeb15868-5efc-4bc1-a297-9c4517cd23ee\") " pod="openstack/nova-cell1-2741-account-create-update-v7c8q" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.825576 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhc55\" (UniqueName: \"kubernetes.io/projected/eeb15868-5efc-4bc1-a297-9c4517cd23ee-kube-api-access-xhc55\") pod \"nova-cell1-2741-account-create-update-v7c8q\" (UID: \"eeb15868-5efc-4bc1-a297-9c4517cd23ee\") " pod="openstack/nova-cell1-2741-account-create-update-v7c8q" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.862161 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7667b055-d667-4488-a733-8f8996295fcf","Type":"ContainerStarted","Data":"dc6781185d6c41cb42ead430c31cf45e2db3bad2b82e8b6fafcf30604b644cd6"} Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.930112 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2741-account-create-update-v7c8q" Dec 02 08:06:01 crc kubenswrapper[4691]: I1202 08:06:01.935416 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-4p5d6"] Dec 02 08:06:02 crc kubenswrapper[4691]: I1202 08:06:02.393592 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0427-account-create-update-kprdn"] Dec 02 08:06:02 crc kubenswrapper[4691]: I1202 08:06:02.465348 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-zdmsm"] Dec 02 08:06:02 crc kubenswrapper[4691]: I1202 08:06:02.637048 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-6ba3-account-create-update-qdwjw"] Dec 02 08:06:02 crc kubenswrapper[4691]: I1202 08:06:02.661113 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-jtr9g"] Dec 02 08:06:02 crc kubenswrapper[4691]: I1202 08:06:02.783040 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2741-account-create-update-v7c8q"] Dec 02 08:06:02 crc kubenswrapper[4691]: I1202 08:06:02.872850 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4p5d6" event={"ID":"911d250b-4d53-4d08-aee0-a92de349d7f1","Type":"ContainerStarted","Data":"b2ad5a41c9ce8abc8151adbd20e160b3e717dcfc7945128a8305501aa320c265"} Dec 02 08:06:02 crc kubenswrapper[4691]: I1202 08:06:02.874829 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0427-account-create-update-kprdn" event={"ID":"37f44801-39b7-4ed2-b8b6-3b15e740e058","Type":"ContainerStarted","Data":"31c9114934cfbb639d45c02409216d381e4034d2fa39f6c759df3d2132fdd403"} Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.913231 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2741-account-create-update-v7c8q" event={"ID":"eeb15868-5efc-4bc1-a297-9c4517cd23ee","Type":"ContainerStarted","Data":"6f25090cf32215af12e79085686c46515d96ebaeb0f3fb194a41b702c640f956"} Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.913632 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2741-account-create-update-v7c8q" event={"ID":"eeb15868-5efc-4bc1-a297-9c4517cd23ee","Type":"ContainerStarted","Data":"3c017b46d47ad6a2352d8b96a74fd09dd73f3761be05ff7a5d3421040967c832"} Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.924211 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0427-account-create-update-kprdn" event={"ID":"37f44801-39b7-4ed2-b8b6-3b15e740e058","Type":"ContainerStarted","Data":"3cadf93969314b116f62c846bac718e0bd2b10279e5621f5912ce1e1e7adbe52"} Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.931861 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4p5d6" event={"ID":"911d250b-4d53-4d08-aee0-a92de349d7f1","Type":"ContainerStarted","Data":"695c054ef2acea72b1510c0324ce6accc69b94c49396976ff578ed978168ef76"} Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.945487 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jtr9g" event={"ID":"0df042e0-5c89-43c2-aa13-6e894851bc4b","Type":"ContainerStarted","Data":"7d68c532baf5e3f8b3001b2becb50642186c0dace1229a7271a41b7365150086"} Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.945536 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jtr9g" 
event={"ID":"0df042e0-5c89-43c2-aa13-6e894851bc4b","Type":"ContainerStarted","Data":"172fbbfbac1d8dff7dfb13c9b886f6c71ebb3a8e064f683a496a240847961d1a"} Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.949376 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zdmsm" event={"ID":"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1","Type":"ContainerStarted","Data":"eed906381f0e4b018d0fe3d4c29052fc2c4ef4d0983999c691e997fd7798372c"} Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.949415 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zdmsm" event={"ID":"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1","Type":"ContainerStarted","Data":"a6c125cd1d729e77b40e308e5200bf2480bfea1d320fce3b4f735a4f595b99b0"} Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.949715 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-2741-account-create-update-v7c8q" podStartSLOduration=2.949696481 podStartE2EDuration="2.949696481s" podCreationTimestamp="2025-12-02 08:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:03.937523952 +0000 UTC m=+1211.721602814" watchObservedRunningTime="2025-12-02 08:06:03.949696481 +0000 UTC m=+1211.733775343" Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.951537 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" event={"ID":"1aa3343a-ce66-4872-9b9e-d011b842e4d1","Type":"ContainerStarted","Data":"84036d7dbc0f957d3fd002f3566d5ad1390b5e8436bc3db79e4d600dec2c129a"} Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.951564 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" event={"ID":"1aa3343a-ce66-4872-9b9e-d011b842e4d1","Type":"ContainerStarted","Data":"795c53fefe2c0d1748e18562b3d8e4cc04e8b2341b45f6a35477518fafd8b07c"} Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.980288 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-4p5d6" podStartSLOduration=3.980265683 podStartE2EDuration="3.980265683s" podCreationTimestamp="2025-12-02 08:06:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:03.966351381 +0000 UTC m=+1211.750430263" watchObservedRunningTime="2025-12-02 08:06:03.980265683 +0000 UTC m=+1211.764344545" Dec 02 08:06:03 crc kubenswrapper[4691]: I1202 08:06:03.996833 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0427-account-create-update-kprdn" podStartSLOduration=2.99680907 podStartE2EDuration="2.99680907s" podCreationTimestamp="2025-12-02 08:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:03.993921359 +0000 UTC m=+1211.778000221" watchObservedRunningTime="2025-12-02 08:06:03.99680907 +0000 UTC m=+1211.780887932" Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.029823 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-zdmsm" podStartSLOduration=3.02979615 podStartE2EDuration="3.02979615s" podCreationTimestamp="2025-12-02 08:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:04.022105361 +0000 UTC m=+1211.806184233" watchObservedRunningTime="2025-12-02 08:06:04.02979615 +0000 UTC m=+1211.813875012" Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.044513 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-jtr9g" podStartSLOduration=3.044492972 podStartE2EDuration="3.044492972s" podCreationTimestamp="2025-12-02 08:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:04.043315563 +0000 UTC m=+1211.827394445" watchObservedRunningTime="2025-12-02 08:06:04.044492972 +0000 UTC m=+1211.828571834" Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.078815 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" podStartSLOduration=3.078793745 podStartE2EDuration="3.078793745s" podCreationTimestamp="2025-12-02 08:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:04.069464966 +0000 UTC m=+1211.853543828" watchObservedRunningTime="2025-12-02 08:06:04.078793745 +0000 UTC m=+1211.862872607" Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.378066 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.378411 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0a7091d4-3a8f-4c52-b3e8-b35913233371" containerName="glance-log" containerID="cri-o://d33831a399d765d53b63507e333fdb5487277f3621ed518e8fa5a7dc61efa222" gracePeriod=30 Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.379579 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0a7091d4-3a8f-4c52-b3e8-b35913233371" containerName="glance-httpd" containerID="cri-o://b0b5d57b0a480c140403eaeb18a43bbd953094018058d8bc77a568ea99f8c0ea" gracePeriod=30 Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.416385 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-758b4cf594-fpkds" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.146:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.146:8443: connect: connection refused" Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.967044 4691 generic.go:334] "Generic (PLEG): container finished" podID="0a7091d4-3a8f-4c52-b3e8-b35913233371" containerID="d33831a399d765d53b63507e333fdb5487277f3621ed518e8fa5a7dc61efa222" exitCode=143 Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.967121 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a7091d4-3a8f-4c52-b3e8-b35913233371","Type":"ContainerDied","Data":"d33831a399d765d53b63507e333fdb5487277f3621ed518e8fa5a7dc61efa222"} Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.972259 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7667b055-d667-4488-a733-8f8996295fcf","Type":"ContainerStarted","Data":"53fb2ecf8b5408a7fcc226c3154c9f24fdc1880cc551414de29eb70896598dd6"} Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.976774 4691 
generic.go:334] "Generic (PLEG): container finished" podID="911d250b-4d53-4d08-aee0-a92de349d7f1" containerID="695c054ef2acea72b1510c0324ce6accc69b94c49396976ff578ed978168ef76" exitCode=0 Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.976879 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4p5d6" event={"ID":"911d250b-4d53-4d08-aee0-a92de349d7f1","Type":"ContainerDied","Data":"695c054ef2acea72b1510c0324ce6accc69b94c49396976ff578ed978168ef76"} Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.984736 4691 generic.go:334] "Generic (PLEG): container finished" podID="0df042e0-5c89-43c2-aa13-6e894851bc4b" containerID="7d68c532baf5e3f8b3001b2becb50642186c0dace1229a7271a41b7365150086" exitCode=0 Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.984954 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jtr9g" event={"ID":"0df042e0-5c89-43c2-aa13-6e894851bc4b","Type":"ContainerDied","Data":"7d68c532baf5e3f8b3001b2becb50642186c0dace1229a7271a41b7365150086"} Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.992199 4691 generic.go:334] "Generic (PLEG): container finished" podID="a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1" containerID="eed906381f0e4b018d0fe3d4c29052fc2c4ef4d0983999c691e997fd7798372c" exitCode=0 Dec 02 08:06:04 crc kubenswrapper[4691]: I1202 08:06:04.992462 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zdmsm" event={"ID":"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1","Type":"ContainerDied","Data":"eed906381f0e4b018d0fe3d4c29052fc2c4ef4d0983999c691e997fd7798372c"} Dec 02 08:06:05 crc kubenswrapper[4691]: I1202 08:06:05.006799 4691 generic.go:334] "Generic (PLEG): container finished" podID="1aa3343a-ce66-4872-9b9e-d011b842e4d1" containerID="84036d7dbc0f957d3fd002f3566d5ad1390b5e8436bc3db79e4d600dec2c129a" exitCode=0 Dec 02 08:06:05 crc kubenswrapper[4691]: I1202 08:06:05.006891 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" event={"ID":"1aa3343a-ce66-4872-9b9e-d011b842e4d1","Type":"ContainerDied","Data":"84036d7dbc0f957d3fd002f3566d5ad1390b5e8436bc3db79e4d600dec2c129a"} Dec 02 08:06:05 crc kubenswrapper[4691]: I1202 08:06:05.017323 4691 generic.go:334] "Generic (PLEG): container finished" podID="eeb15868-5efc-4bc1-a297-9c4517cd23ee" containerID="6f25090cf32215af12e79085686c46515d96ebaeb0f3fb194a41b702c640f956" exitCode=0 Dec 02 08:06:05 crc kubenswrapper[4691]: I1202 08:06:05.017403 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2741-account-create-update-v7c8q" event={"ID":"eeb15868-5efc-4bc1-a297-9c4517cd23ee","Type":"ContainerDied","Data":"6f25090cf32215af12e79085686c46515d96ebaeb0f3fb194a41b702c640f956"} Dec 02 08:06:05 crc kubenswrapper[4691]: I1202 08:06:05.019315 4691 generic.go:334] "Generic (PLEG): container finished" podID="37f44801-39b7-4ed2-b8b6-3b15e740e058" containerID="3cadf93969314b116f62c846bac718e0bd2b10279e5621f5912ce1e1e7adbe52" exitCode=0 Dec 02 08:06:05 crc kubenswrapper[4691]: I1202 08:06:05.019356 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0427-account-create-update-kprdn" event={"ID":"37f44801-39b7-4ed2-b8b6-3b15e740e058","Type":"ContainerDied","Data":"3cadf93969314b116f62c846bac718e0bd2b10279e5621f5912ce1e1e7adbe52"} Dec 02 08:06:06 crc kubenswrapper[4691]: I1202 08:06:06.049367 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7667b055-d667-4488-a733-8f8996295fcf","Type":"ContainerStarted","Data":"f8fa2503c2f1cd80b686c9edce62c90a6b298c0a47fc077f20ee3880c8341dd2"} Dec 02 08:06:06 crc kubenswrapper[4691]: I1202 08:06:06.597352 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4p5d6" Dec 02 08:06:06 crc kubenswrapper[4691]: I1202 08:06:06.738077 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/911d250b-4d53-4d08-aee0-a92de349d7f1-operator-scripts\") pod \"911d250b-4d53-4d08-aee0-a92de349d7f1\" (UID: \"911d250b-4d53-4d08-aee0-a92de349d7f1\") " Dec 02 08:06:06 crc kubenswrapper[4691]: I1202 08:06:06.738505 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vttfg\" (UniqueName: \"kubernetes.io/projected/911d250b-4d53-4d08-aee0-a92de349d7f1-kube-api-access-vttfg\") pod \"911d250b-4d53-4d08-aee0-a92de349d7f1\" (UID: \"911d250b-4d53-4d08-aee0-a92de349d7f1\") " Dec 02 08:06:06 crc kubenswrapper[4691]: I1202 08:06:06.738956 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/911d250b-4d53-4d08-aee0-a92de349d7f1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "911d250b-4d53-4d08-aee0-a92de349d7f1" (UID: "911d250b-4d53-4d08-aee0-a92de349d7f1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:06:06 crc kubenswrapper[4691]: I1202 08:06:06.739185 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/911d250b-4d53-4d08-aee0-a92de349d7f1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:06 crc kubenswrapper[4691]: I1202 08:06:06.784166 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/911d250b-4d53-4d08-aee0-a92de349d7f1-kube-api-access-vttfg" (OuterVolumeSpecName: "kube-api-access-vttfg") pod "911d250b-4d53-4d08-aee0-a92de349d7f1" (UID: "911d250b-4d53-4d08-aee0-a92de349d7f1"). InnerVolumeSpecName "kube-api-access-vttfg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:06 crc kubenswrapper[4691]: I1202 08:06:06.842236 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vttfg\" (UniqueName: \"kubernetes.io/projected/911d250b-4d53-4d08-aee0-a92de349d7f1-kube-api-access-vttfg\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.000462 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0427-account-create-update-kprdn" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.006008 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zdmsm" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.039156 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2741-account-create-update-v7c8q" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.045223 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2q9r\" (UniqueName: \"kubernetes.io/projected/37f44801-39b7-4ed2-b8b6-3b15e740e058-kube-api-access-r2q9r\") pod \"37f44801-39b7-4ed2-b8b6-3b15e740e058\" (UID: \"37f44801-39b7-4ed2-b8b6-3b15e740e058\") " Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.045291 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhc55\" (UniqueName: \"kubernetes.io/projected/eeb15868-5efc-4bc1-a297-9c4517cd23ee-kube-api-access-xhc55\") pod \"eeb15868-5efc-4bc1-a297-9c4517cd23ee\" (UID: \"eeb15868-5efc-4bc1-a297-9c4517cd23ee\") " Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.045325 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cb2g\" (UniqueName: \"kubernetes.io/projected/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1-kube-api-access-8cb2g\") pod \"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1\" (UID: \"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1\") " Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.045363 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f44801-39b7-4ed2-b8b6-3b15e740e058-operator-scripts\") pod \"37f44801-39b7-4ed2-b8b6-3b15e740e058\" (UID: \"37f44801-39b7-4ed2-b8b6-3b15e740e058\") " Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.045395 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1-operator-scripts\") pod \"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1\" (UID: \"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1\") " Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.045451 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb15868-5efc-4bc1-a297-9c4517cd23ee-operator-scripts\") pod \"eeb15868-5efc-4bc1-a297-9c4517cd23ee\" (UID: \"eeb15868-5efc-4bc1-a297-9c4517cd23ee\") " Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.046388 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eeb15868-5efc-4bc1-a297-9c4517cd23ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eeb15868-5efc-4bc1-a297-9c4517cd23ee" (UID: "eeb15868-5efc-4bc1-a297-9c4517cd23ee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.047725 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37f44801-39b7-4ed2-b8b6-3b15e740e058-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37f44801-39b7-4ed2-b8b6-3b15e740e058" (UID: "37f44801-39b7-4ed2-b8b6-3b15e740e058"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.060074 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37f44801-39b7-4ed2-b8b6-3b15e740e058-kube-api-access-r2q9r" (OuterVolumeSpecName: "kube-api-access-r2q9r") pod "37f44801-39b7-4ed2-b8b6-3b15e740e058" (UID: "37f44801-39b7-4ed2-b8b6-3b15e740e058"). 
InnerVolumeSpecName "kube-api-access-r2q9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.060637 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1-kube-api-access-8cb2g" (OuterVolumeSpecName: "kube-api-access-8cb2g") pod "a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1" (UID: "a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1"). InnerVolumeSpecName "kube-api-access-8cb2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.063809 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-jtr9g" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.068525 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb15868-5efc-4bc1-a297-9c4517cd23ee-kube-api-access-xhc55" (OuterVolumeSpecName: "kube-api-access-xhc55") pod "eeb15868-5efc-4bc1-a297-9c4517cd23ee" (UID: "eeb15868-5efc-4bc1-a297-9c4517cd23ee"). InnerVolumeSpecName "kube-api-access-xhc55". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.071295 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.078280 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1" (UID: "a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.103344 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-4p5d6" event={"ID":"911d250b-4d53-4d08-aee0-a92de349d7f1","Type":"ContainerDied","Data":"b2ad5a41c9ce8abc8151adbd20e160b3e717dcfc7945128a8305501aa320c265"} Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.103371 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-4p5d6" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.103390 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2ad5a41c9ce8abc8151adbd20e160b3e717dcfc7945128a8305501aa320c265" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.121561 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-jtr9g" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.121617 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-jtr9g" event={"ID":"0df042e0-5c89-43c2-aa13-6e894851bc4b","Type":"ContainerDied","Data":"172fbbfbac1d8dff7dfb13c9b886f6c71ebb3a8e064f683a496a240847961d1a"} Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.121674 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="172fbbfbac1d8dff7dfb13c9b886f6c71ebb3a8e064f683a496a240847961d1a" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.128687 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-zdmsm" event={"ID":"a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1","Type":"ContainerDied","Data":"a6c125cd1d729e77b40e308e5200bf2480bfea1d320fce3b4f735a4f595b99b0"} Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.128748 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6c125cd1d729e77b40e308e5200bf2480bfea1d320fce3b4f735a4f595b99b0" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.128836 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-zdmsm" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.140552 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2741-account-create-update-v7c8q" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.140848 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2741-account-create-update-v7c8q" event={"ID":"eeb15868-5efc-4bc1-a297-9c4517cd23ee","Type":"ContainerDied","Data":"3c017b46d47ad6a2352d8b96a74fd09dd73f3761be05ff7a5d3421040967c832"} Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.140910 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c017b46d47ad6a2352d8b96a74fd09dd73f3761be05ff7a5d3421040967c832" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.147454 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrkt4\" (UniqueName: \"kubernetes.io/projected/1aa3343a-ce66-4872-9b9e-d011b842e4d1-kube-api-access-xrkt4\") pod \"1aa3343a-ce66-4872-9b9e-d011b842e4d1\" (UID: \"1aa3343a-ce66-4872-9b9e-d011b842e4d1\") " Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.147538 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0df042e0-5c89-43c2-aa13-6e894851bc4b-operator-scripts\") pod \"0df042e0-5c89-43c2-aa13-6e894851bc4b\" (UID: \"0df042e0-5c89-43c2-aa13-6e894851bc4b\") " Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.147604 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa3343a-ce66-4872-9b9e-d011b842e4d1-operator-scripts\") pod \"1aa3343a-ce66-4872-9b9e-d011b842e4d1\" (UID: \"1aa3343a-ce66-4872-9b9e-d011b842e4d1\") " Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.148200 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f4jw\" (UniqueName: \"kubernetes.io/projected/0df042e0-5c89-43c2-aa13-6e894851bc4b-kube-api-access-4f4jw\") pod \"0df042e0-5c89-43c2-aa13-6e894851bc4b\" (UID: \"0df042e0-5c89-43c2-aa13-6e894851bc4b\") " Dec 02 08:06:07 crc kubenswrapper[4691]: 
I1202 08:06:07.148727 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0df042e0-5c89-43c2-aa13-6e894851bc4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0df042e0-5c89-43c2-aa13-6e894851bc4b" (UID: "0df042e0-5c89-43c2-aa13-6e894851bc4b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.149338 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1aa3343a-ce66-4872-9b9e-d011b842e4d1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1aa3343a-ce66-4872-9b9e-d011b842e4d1" (UID: "1aa3343a-ce66-4872-9b9e-d011b842e4d1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.149517 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2q9r\" (UniqueName: \"kubernetes.io/projected/37f44801-39b7-4ed2-b8b6-3b15e740e058-kube-api-access-r2q9r\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.149548 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhc55\" (UniqueName: \"kubernetes.io/projected/eeb15868-5efc-4bc1-a297-9c4517cd23ee-kube-api-access-xhc55\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.149559 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cb2g\" (UniqueName: \"kubernetes.io/projected/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1-kube-api-access-8cb2g\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.149571 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37f44801-39b7-4ed2-b8b6-3b15e740e058-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.149585 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.149594 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0df042e0-5c89-43c2-aa13-6e894851bc4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.149603 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1aa3343a-ce66-4872-9b9e-d011b842e4d1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.149614 4691 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eeb15868-5efc-4bc1-a297-9c4517cd23ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.158242 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0df042e0-5c89-43c2-aa13-6e894851bc4b-kube-api-access-4f4jw" (OuterVolumeSpecName: "kube-api-access-4f4jw") pod "0df042e0-5c89-43c2-aa13-6e894851bc4b" (UID: "0df042e0-5c89-43c2-aa13-6e894851bc4b"). InnerVolumeSpecName "kube-api-access-4f4jw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.158339 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa3343a-ce66-4872-9b9e-d011b842e4d1-kube-api-access-xrkt4" (OuterVolumeSpecName: "kube-api-access-xrkt4") pod "1aa3343a-ce66-4872-9b9e-d011b842e4d1" (UID: "1aa3343a-ce66-4872-9b9e-d011b842e4d1"). InnerVolumeSpecName "kube-api-access-xrkt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.158969 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.158979 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-6ba3-account-create-update-qdwjw" event={"ID":"1aa3343a-ce66-4872-9b9e-d011b842e4d1","Type":"ContainerDied","Data":"795c53fefe2c0d1748e18562b3d8e4cc04e8b2341b45f6a35477518fafd8b07c"} Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.159071 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="795c53fefe2c0d1748e18562b3d8e4cc04e8b2341b45f6a35477518fafd8b07c" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.163050 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0427-account-create-update-kprdn" event={"ID":"37f44801-39b7-4ed2-b8b6-3b15e740e058","Type":"ContainerDied","Data":"31c9114934cfbb639d45c02409216d381e4034d2fa39f6c759df3d2132fdd403"} Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.163104 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31c9114934cfbb639d45c02409216d381e4034d2fa39f6c759df3d2132fdd403" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.163185 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0427-account-create-update-kprdn" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.252217 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrkt4\" (UniqueName: \"kubernetes.io/projected/1aa3343a-ce66-4872-9b9e-d011b842e4d1-kube-api-access-xrkt4\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.252266 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f4jw\" (UniqueName: \"kubernetes.io/projected/0df042e0-5c89-43c2-aa13-6e894851bc4b-kube-api-access-4f4jw\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.271490 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:06:07 crc kubenswrapper[4691]: I1202 08:06:07.274452 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-76d578d5f5-hmcbw" Dec 02 08:06:08 crc kubenswrapper[4691]: I1202 08:06:08.187001 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a7091d4-3a8f-4c52-b3e8-b35913233371","Type":"ContainerDied","Data":"b0b5d57b0a480c140403eaeb18a43bbd953094018058d8bc77a568ea99f8c0ea"} Dec 02 08:06:08 crc kubenswrapper[4691]: I1202 08:06:08.187133 4691 generic.go:334] "Generic (PLEG): container finished" podID="0a7091d4-3a8f-4c52-b3e8-b35913233371" containerID="b0b5d57b0a480c140403eaeb18a43bbd953094018058d8bc77a568ea99f8c0ea" exitCode=0 Dec 02 08:06:08 crc kubenswrapper[4691]: I1202 08:06:08.194351 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="ceilometer-central-agent" containerID="cri-o://dc6781185d6c41cb42ead430c31cf45e2db3bad2b82e8b6fafcf30604b644cd6" gracePeriod=30 Dec 02 08:06:08 crc kubenswrapper[4691]: I1202 08:06:08.195049 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7667b055-d667-4488-a733-8f8996295fcf","Type":"ContainerStarted","Data":"2711c3c37797ea7250c003f547df3243a84df00ef2d8258cb2ba9c3404205c7b"} Dec 02 08:06:08 crc kubenswrapper[4691]: I1202 08:06:08.195130 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 08:06:08 crc kubenswrapper[4691]: I1202 08:06:08.195192 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="proxy-httpd" containerID="cri-o://2711c3c37797ea7250c003f547df3243a84df00ef2d8258cb2ba9c3404205c7b" gracePeriod=30 Dec 02 08:06:08 crc kubenswrapper[4691]: I1202 08:06:08.195337 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="ceilometer-notification-agent" containerID="cri-o://53fb2ecf8b5408a7fcc226c3154c9f24fdc1880cc551414de29eb70896598dd6" gracePeriod=30 Dec 02 08:06:08 crc kubenswrapper[4691]: I1202 08:06:08.195409 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="sg-core" containerID="cri-o://f8fa2503c2f1cd80b686c9edce62c90a6b298c0a47fc077f20ee3880c8341dd2" gracePeriod=30 Dec 02 08:06:08 crc kubenswrapper[4691]: I1202 08:06:08.598117 4691 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.549848037 podStartE2EDuration="10.598097871s" podCreationTimestamp="2025-12-02 08:05:58 +0000 UTC" firstStartedPulling="2025-12-02 08:05:59.64498311 +0000 UTC m=+1207.429061972" lastFinishedPulling="2025-12-02 08:06:06.693232944 +0000 UTC m=+1214.477311806" observedRunningTime="2025-12-02 08:06:08.224270762 +0000 UTC m=+1216.008349624" watchObservedRunningTime="2025-12-02 08:06:08.598097871 +0000 UTC m=+1216.382176763" Dec 02 08:06:08 crc kubenswrapper[4691]: I1202 08:06:08.617572 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:06:08 crc kubenswrapper[4691]: I1202 08:06:08.617997 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0f883f9e-2ece-4a76-85c8-46cca73e0796" containerName="glance-log" containerID="cri-o://07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08" gracePeriod=30 Dec 02 08:06:08 crc kubenswrapper[4691]: I1202 08:06:08.618202 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="0f883f9e-2ece-4a76-85c8-46cca73e0796" containerName="glance-httpd" containerID="cri-o://e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5" gracePeriod=30 Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.070709 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.191122 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7x2h7\" (UniqueName: \"kubernetes.io/projected/0a7091d4-3a8f-4c52-b3e8-b35913233371-kube-api-access-7x2h7\") pod \"0a7091d4-3a8f-4c52-b3e8-b35913233371\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.191284 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a7091d4-3a8f-4c52-b3e8-b35913233371-httpd-run\") pod \"0a7091d4-3a8f-4c52-b3e8-b35913233371\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.192147 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a7091d4-3a8f-4c52-b3e8-b35913233371-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0a7091d4-3a8f-4c52-b3e8-b35913233371" (UID: "0a7091d4-3a8f-4c52-b3e8-b35913233371"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.192233 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-combined-ca-bundle\") pod \"0a7091d4-3a8f-4c52-b3e8-b35913233371\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.192798 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a7091d4-3a8f-4c52-b3e8-b35913233371-logs\") pod \"0a7091d4-3a8f-4c52-b3e8-b35913233371\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.192969 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-config-data\") pod \"0a7091d4-3a8f-4c52-b3e8-b35913233371\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.193150 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-public-tls-certs\") pod \"0a7091d4-3a8f-4c52-b3e8-b35913233371\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.193219 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-scripts\") pod \"0a7091d4-3a8f-4c52-b3e8-b35913233371\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.193275 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"0a7091d4-3a8f-4c52-b3e8-b35913233371\" (UID: \"0a7091d4-3a8f-4c52-b3e8-b35913233371\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.194053 4691 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0a7091d4-3a8f-4c52-b3e8-b35913233371-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.201398 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a7091d4-3a8f-4c52-b3e8-b35913233371-kube-api-access-7x2h7" (OuterVolumeSpecName: "kube-api-access-7x2h7") pod "0a7091d4-3a8f-4c52-b3e8-b35913233371" (UID: "0a7091d4-3a8f-4c52-b3e8-b35913233371"). InnerVolumeSpecName "kube-api-access-7x2h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.202596 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a7091d4-3a8f-4c52-b3e8-b35913233371-logs" (OuterVolumeSpecName: "logs") pod "0a7091d4-3a8f-4c52-b3e8-b35913233371" (UID: "0a7091d4-3a8f-4c52-b3e8-b35913233371"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.209358 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-scripts" (OuterVolumeSpecName: "scripts") pod "0a7091d4-3a8f-4c52-b3e8-b35913233371" (UID: "0a7091d4-3a8f-4c52-b3e8-b35913233371"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.213439 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0a7091d4-3a8f-4c52-b3e8-b35913233371","Type":"ContainerDied","Data":"fa065bfcb973e8808a2302cbfb4c688ec66ef43aa9372ae722364157f5a62ef7"} Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.213511 4691 scope.go:117] "RemoveContainer" containerID="b0b5d57b0a480c140403eaeb18a43bbd953094018058d8bc77a568ea99f8c0ea" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.213674 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.230088 4691 generic.go:334] "Generic (PLEG): container finished" podID="7667b055-d667-4488-a733-8f8996295fcf" containerID="2711c3c37797ea7250c003f547df3243a84df00ef2d8258cb2ba9c3404205c7b" exitCode=0 Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.230650 4691 generic.go:334] "Generic (PLEG): container finished" podID="7667b055-d667-4488-a733-8f8996295fcf" containerID="f8fa2503c2f1cd80b686c9edce62c90a6b298c0a47fc077f20ee3880c8341dd2" exitCode=2 Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.230665 4691 generic.go:334] "Generic (PLEG): container finished" podID="7667b055-d667-4488-a733-8f8996295fcf" containerID="53fb2ecf8b5408a7fcc226c3154c9f24fdc1880cc551414de29eb70896598dd6" exitCode=0 Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.230674 4691 generic.go:334] "Generic (PLEG): container finished" podID="7667b055-d667-4488-a733-8f8996295fcf" containerID="dc6781185d6c41cb42ead430c31cf45e2db3bad2b82e8b6fafcf30604b644cd6" exitCode=0 Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.230744 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7667b055-d667-4488-a733-8f8996295fcf","Type":"ContainerDied","Data":"2711c3c37797ea7250c003f547df3243a84df00ef2d8258cb2ba9c3404205c7b"} Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.230820 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7667b055-d667-4488-a733-8f8996295fcf","Type":"ContainerDied","Data":"f8fa2503c2f1cd80b686c9edce62c90a6b298c0a47fc077f20ee3880c8341dd2"} Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.230836 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7667b055-d667-4488-a733-8f8996295fcf","Type":"ContainerDied","Data":"53fb2ecf8b5408a7fcc226c3154c9f24fdc1880cc551414de29eb70896598dd6"} Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.230847 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7667b055-d667-4488-a733-8f8996295fcf","Type":"ContainerDied","Data":"dc6781185d6c41cb42ead430c31cf45e2db3bad2b82e8b6fafcf30604b644cd6"} Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.239900 4691 generic.go:334] "Generic (PLEG): container finished" podID="0f883f9e-2ece-4a76-85c8-46cca73e0796" 
containerID="07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08" exitCode=143 Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.240016 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0f883f9e-2ece-4a76-85c8-46cca73e0796","Type":"ContainerDied","Data":"07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08"} Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.255783 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "0a7091d4-3a8f-4c52-b3e8-b35913233371" (UID: "0a7091d4-3a8f-4c52-b3e8-b35913233371"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.274471 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.285039 4691 scope.go:117] "RemoveContainer" containerID="d33831a399d765d53b63507e333fdb5487277f3621ed518e8fa5a7dc61efa222" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.293159 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-config-data" (OuterVolumeSpecName: "config-data") pod "0a7091d4-3a8f-4c52-b3e8-b35913233371" (UID: "0a7091d4-3a8f-4c52-b3e8-b35913233371"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.303731 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-scripts\") pod \"7667b055-d667-4488-a733-8f8996295fcf\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.304002 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-sg-core-conf-yaml\") pod \"7667b055-d667-4488-a733-8f8996295fcf\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.304158 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-config-data\") pod \"7667b055-d667-4488-a733-8f8996295fcf\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.304265 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7667b055-d667-4488-a733-8f8996295fcf-log-httpd\") pod \"7667b055-d667-4488-a733-8f8996295fcf\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.304970 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.306030 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.306144 4691 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7x2h7\" (UniqueName: \"kubernetes.io/projected/0a7091d4-3a8f-4c52-b3e8-b35913233371-kube-api-access-7x2h7\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.306187 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a7091d4-3a8f-4c52-b3e8-b35913233371-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.306201 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.310450 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7667b055-d667-4488-a733-8f8996295fcf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7667b055-d667-4488-a733-8f8996295fcf" (UID: "7667b055-d667-4488-a733-8f8996295fcf"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.311259 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-scripts" (OuterVolumeSpecName: "scripts") pod "7667b055-d667-4488-a733-8f8996295fcf" (UID: "7667b055-d667-4488-a733-8f8996295fcf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.320218 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a7091d4-3a8f-4c52-b3e8-b35913233371" (UID: "0a7091d4-3a8f-4c52-b3e8-b35913233371"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.325784 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0a7091d4-3a8f-4c52-b3e8-b35913233371" (UID: "0a7091d4-3a8f-4c52-b3e8-b35913233371"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.345717 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.367035 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7667b055-d667-4488-a733-8f8996295fcf" (UID: "7667b055-d667-4488-a733-8f8996295fcf"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.407556 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7667b055-d667-4488-a733-8f8996295fcf-run-httpd\") pod \"7667b055-d667-4488-a733-8f8996295fcf\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.407731 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-combined-ca-bundle\") pod \"7667b055-d667-4488-a733-8f8996295fcf\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.407938 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44fl2\" (UniqueName: \"kubernetes.io/projected/7667b055-d667-4488-a733-8f8996295fcf-kube-api-access-44fl2\") pod \"7667b055-d667-4488-a733-8f8996295fcf\" (UID: \"7667b055-d667-4488-a733-8f8996295fcf\") " Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.408830 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.408829 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7667b055-d667-4488-a733-8f8996295fcf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7667b055-d667-4488-a733-8f8996295fcf" (UID: "7667b055-d667-4488-a733-8f8996295fcf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.408866 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7667b055-d667-4488-a733-8f8996295fcf-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.408963 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.408981 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.408999 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.409012 4691 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a7091d4-3a8f-4c52-b3e8-b35913233371-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.412816 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7667b055-d667-4488-a733-8f8996295fcf-kube-api-access-44fl2" (OuterVolumeSpecName: "kube-api-access-44fl2") pod "7667b055-d667-4488-a733-8f8996295fcf" (UID: "7667b055-d667-4488-a733-8f8996295fcf"). InnerVolumeSpecName "kube-api-access-44fl2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.437186 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-config-data" (OuterVolumeSpecName: "config-data") pod "7667b055-d667-4488-a733-8f8996295fcf" (UID: "7667b055-d667-4488-a733-8f8996295fcf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.492935 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7667b055-d667-4488-a733-8f8996295fcf" (UID: "7667b055-d667-4488-a733-8f8996295fcf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.511251 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44fl2\" (UniqueName: \"kubernetes.io/projected/7667b055-d667-4488-a733-8f8996295fcf-kube-api-access-44fl2\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.511306 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.511324 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7667b055-d667-4488-a733-8f8996295fcf-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.511339 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7667b055-d667-4488-a733-8f8996295fcf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.559446 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.574347 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.603669 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:06:09 crc kubenswrapper[4691]: E1202 08:06:09.604519 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0df042e0-5c89-43c2-aa13-6e894851bc4b" containerName="mariadb-database-create" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.604552 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0df042e0-5c89-43c2-aa13-6e894851bc4b" containerName="mariadb-database-create" Dec 02 08:06:09 crc kubenswrapper[4691]: E1202 08:06:09.604583 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="911d250b-4d53-4d08-aee0-a92de349d7f1" containerName="mariadb-database-create" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.604590 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="911d250b-4d53-4d08-aee0-a92de349d7f1" containerName="mariadb-database-create" Dec 02 08:06:09 crc kubenswrapper[4691]: E1202 08:06:09.604609 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="proxy-httpd" Dec 02 08:06:09 crc 
kubenswrapper[4691]: I1202 08:06:09.604614 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="proxy-httpd" Dec 02 08:06:09 crc kubenswrapper[4691]: E1202 08:06:09.604626 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7091d4-3a8f-4c52-b3e8-b35913233371" containerName="glance-log" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.604634 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7091d4-3a8f-4c52-b3e8-b35913233371" containerName="glance-log" Dec 02 08:06:09 crc kubenswrapper[4691]: E1202 08:06:09.604645 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1" containerName="mariadb-database-create" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.604651 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1" containerName="mariadb-database-create" Dec 02 08:06:09 crc kubenswrapper[4691]: E1202 08:06:09.604660 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa3343a-ce66-4872-9b9e-d011b842e4d1" containerName="mariadb-account-create-update" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.604669 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa3343a-ce66-4872-9b9e-d011b842e4d1" containerName="mariadb-account-create-update" Dec 02 08:06:09 crc kubenswrapper[4691]: E1202 08:06:09.604681 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="sg-core" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.604687 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="sg-core" Dec 02 08:06:09 crc kubenswrapper[4691]: E1202 08:06:09.604716 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="ceilometer-notification-agent" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.604725 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="ceilometer-notification-agent" Dec 02 08:06:09 crc kubenswrapper[4691]: E1202 08:06:09.604736 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="ceilometer-central-agent" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.604745 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="ceilometer-central-agent" Dec 02 08:06:09 crc kubenswrapper[4691]: E1202 08:06:09.604753 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37f44801-39b7-4ed2-b8b6-3b15e740e058" containerName="mariadb-account-create-update" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.604780 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="37f44801-39b7-4ed2-b8b6-3b15e740e058" containerName="mariadb-account-create-update" Dec 02 08:06:09 crc kubenswrapper[4691]: E1202 08:06:09.604793 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a7091d4-3a8f-4c52-b3e8-b35913233371" containerName="glance-httpd" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.604801 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a7091d4-3a8f-4c52-b3e8-b35913233371" containerName="glance-httpd" Dec 02 08:06:09 crc kubenswrapper[4691]: E1202 08:06:09.604814 4691 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="eeb15868-5efc-4bc1-a297-9c4517cd23ee" containerName="mariadb-account-create-update" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.604822 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb15868-5efc-4bc1-a297-9c4517cd23ee" containerName="mariadb-account-create-update" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.605020 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa3343a-ce66-4872-9b9e-d011b842e4d1" containerName="mariadb-account-create-update" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.605033 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="911d250b-4d53-4d08-aee0-a92de349d7f1" containerName="mariadb-database-create" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.605048 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="sg-core" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.605059 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="ceilometer-central-agent" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.605072 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb15868-5efc-4bc1-a297-9c4517cd23ee" containerName="mariadb-account-create-update" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.605083 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a7091d4-3a8f-4c52-b3e8-b35913233371" containerName="glance-httpd" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.605093 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="ceilometer-notification-agent" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.605109 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a7091d4-3a8f-4c52-b3e8-b35913233371" containerName="glance-log" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.605117 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1" containerName="mariadb-database-create" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.605123 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7667b055-d667-4488-a733-8f8996295fcf" containerName="proxy-httpd" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.605141 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="37f44801-39b7-4ed2-b8b6-3b15e740e058" containerName="mariadb-account-create-update" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.605150 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0df042e0-5c89-43c2-aa13-6e894851bc4b" containerName="mariadb-database-create" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.606370 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.614147 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.615089 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.629207 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.716080 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23424898-747d-4eef-8f7e-ee64e1bf1070-config-data\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.716153 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.716428 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23424898-747d-4eef-8f7e-ee64e1bf1070-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.716533 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23424898-747d-4eef-8f7e-ee64e1bf1070-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.716896 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23424898-747d-4eef-8f7e-ee64e1bf1070-logs\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.717045 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r498m\" (UniqueName: \"kubernetes.io/projected/23424898-747d-4eef-8f7e-ee64e1bf1070-kube-api-access-r498m\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.717261 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23424898-747d-4eef-8f7e-ee64e1bf1070-scripts\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.717525 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23424898-747d-4eef-8f7e-ee64e1bf1070-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.820263 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r498m\" (UniqueName: \"kubernetes.io/projected/23424898-747d-4eef-8f7e-ee64e1bf1070-kube-api-access-r498m\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.820386 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23424898-747d-4eef-8f7e-ee64e1bf1070-scripts\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.820482 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23424898-747d-4eef-8f7e-ee64e1bf1070-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.820583 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23424898-747d-4eef-8f7e-ee64e1bf1070-config-data\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.821355 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.821367 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/23424898-747d-4eef-8f7e-ee64e1bf1070-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.821656 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.822015 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23424898-747d-4eef-8f7e-ee64e1bf1070-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.822622 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/23424898-747d-4eef-8f7e-ee64e1bf1070-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.822914 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23424898-747d-4eef-8f7e-ee64e1bf1070-logs\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.823340 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/23424898-747d-4eef-8f7e-ee64e1bf1070-logs\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.826170 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/23424898-747d-4eef-8f7e-ee64e1bf1070-config-data\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.826380 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/23424898-747d-4eef-8f7e-ee64e1bf1070-scripts\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.827080 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/23424898-747d-4eef-8f7e-ee64e1bf1070-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.831546 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/23424898-747d-4eef-8f7e-ee64e1bf1070-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.844352 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r498m\" (UniqueName: \"kubernetes.io/projected/23424898-747d-4eef-8f7e-ee64e1bf1070-kube-api-access-r498m\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.857489 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"23424898-747d-4eef-8f7e-ee64e1bf1070\") " pod="openstack/glance-default-external-api-0" Dec 02 08:06:09 crc kubenswrapper[4691]: I1202 08:06:09.926284 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.264701 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.266992 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7667b055-d667-4488-a733-8f8996295fcf","Type":"ContainerDied","Data":"ee207e61d08b823576778b13136f580f64ab79c725b424fa5ebfffd7df024d36"} Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.267086 4691 scope.go:117] "RemoveContainer" containerID="2711c3c37797ea7250c003f547df3243a84df00ef2d8258cb2ba9c3404205c7b" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.315129 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.317533 4691 scope.go:117] "RemoveContainer" containerID="f8fa2503c2f1cd80b686c9edce62c90a6b298c0a47fc077f20ee3880c8341dd2" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.326838 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.387078 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.392013 4691 scope.go:117] "RemoveContainer" containerID="53fb2ecf8b5408a7fcc226c3154c9f24fdc1880cc551414de29eb70896598dd6" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.392573 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.401451 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.405345 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.411330 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.453523 4691 scope.go:117] "RemoveContainer" containerID="dc6781185d6c41cb42ead430c31cf45e2db3bad2b82e8b6fafcf30604b644cd6" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.544293 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n2qp\" (UniqueName: \"kubernetes.io/projected/4786bf63-8e3b-4048-8fde-998c8ce209a1-kube-api-access-2n2qp\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.544357 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4786bf63-8e3b-4048-8fde-998c8ce209a1-run-httpd\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.544375 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.544397 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4786bf63-8e3b-4048-8fde-998c8ce209a1-log-httpd\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.544430 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-config-data\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.544533 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.544587 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-scripts\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.574199 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a7091d4-3a8f-4c52-b3e8-b35913233371" path="/var/lib/kubelet/pods/0a7091d4-3a8f-4c52-b3e8-b35913233371/volumes" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.574909 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7667b055-d667-4488-a733-8f8996295fcf" path="/var/lib/kubelet/pods/7667b055-d667-4488-a733-8f8996295fcf/volumes" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.598576 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.646050 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-scripts\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.646151 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n2qp\" (UniqueName: \"kubernetes.io/projected/4786bf63-8e3b-4048-8fde-998c8ce209a1-kube-api-access-2n2qp\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.646220 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4786bf63-8e3b-4048-8fde-998c8ce209a1-run-httpd\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.646246 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.646271 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4786bf63-8e3b-4048-8fde-998c8ce209a1-log-httpd\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.646307 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-config-data\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.646353 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.648547 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4786bf63-8e3b-4048-8fde-998c8ce209a1-log-httpd\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.648888 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4786bf63-8e3b-4048-8fde-998c8ce209a1-run-httpd\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.655835 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.659658 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-config-data\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.695919 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.704580 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-scripts\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.712836 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n2qp\" (UniqueName: \"kubernetes.io/projected/4786bf63-8e3b-4048-8fde-998c8ce209a1-kube-api-access-2n2qp\") pod \"ceilometer-0\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " pod="openstack/ceilometer-0" Dec 02 08:06:10 crc kubenswrapper[4691]: I1202 08:06:10.733297 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:06:11 crc kubenswrapper[4691]: I1202 08:06:11.162977 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:06:11 crc kubenswrapper[4691]: I1202 08:06:11.276844 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:06:11 crc kubenswrapper[4691]: I1202 08:06:11.301036 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23424898-747d-4eef-8f7e-ee64e1bf1070","Type":"ContainerStarted","Data":"fd1c99f0bf0d35c9b029b9472cc2bbc9686afa946c197916eead06728482a88a"} Dec 02 08:06:11 crc kubenswrapper[4691]: I1202 08:06:11.320604 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.988586 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-ee207e61d08b823576778b13136f580f64ab79c725b424fa5ebfffd7df024d36": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-ee207e61d08b823576778b13136f580f64ab79c725b424fa5ebfffd7df024d36: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989029 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-conmon-dc6781185d6c41cb42ead430c31cf45e2db3bad2b82e8b6fafcf30604b644cd6.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-conmon-dc6781185d6c41cb42ead430c31cf45e2db3bad2b82e8b6fafcf30604b644cd6.scope: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989048 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-dc6781185d6c41cb42ead430c31cf45e2db3bad2b82e8b6fafcf30604b644cd6.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-dc6781185d6c41cb42ead430c31cf45e2db3bad2b82e8b6fafcf30604b644cd6.scope: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989076 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod911d250b_4d53_4d08_aee0_a92de349d7f1.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod911d250b_4d53_4d08_aee0_a92de349d7f1.slice: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989095 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5b0b24f_61f2_4abf_8dd5_6cecb20ac4c1.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5b0b24f_61f2_4abf_8dd5_6cecb20ac4c1.slice: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989114 
4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37f44801_39b7_4ed2_b8b6_3b15e740e058.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37f44801_39b7_4ed2_b8b6_3b15e740e058.slice: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989131 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0df042e0_5c89_43c2_aa13_6e894851bc4b.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0df042e0_5c89_43c2_aa13_6e894851bc4b.slice: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989150 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aa3343a_ce66_4872_9b9e_d011b842e4d1.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aa3343a_ce66_4872_9b9e_d011b842e4d1.slice: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989170 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeeb15868_5efc_4bc1_a297_9c4517cd23ee.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeeb15868_5efc_4bc1_a297_9c4517cd23ee.slice: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989187 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-conmon-53fb2ecf8b5408a7fcc226c3154c9f24fdc1880cc551414de29eb70896598dd6.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-conmon-53fb2ecf8b5408a7fcc226c3154c9f24fdc1880cc551414de29eb70896598dd6.scope: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989202 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-53fb2ecf8b5408a7fcc226c3154c9f24fdc1880cc551414de29eb70896598dd6.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-53fb2ecf8b5408a7fcc226c3154c9f24fdc1880cc551414de29eb70896598dd6.scope: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989293 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-conmon-f8fa2503c2f1cd80b686c9edce62c90a6b298c0a47fc077f20ee3880c8341dd2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-conmon-f8fa2503c2f1cd80b686c9edce62c90a6b298c0a47fc077f20ee3880c8341dd2.scope: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 
08:06:11.989316 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-f8fa2503c2f1cd80b686c9edce62c90a6b298c0a47fc077f20ee3880c8341dd2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-f8fa2503c2f1cd80b686c9edce62c90a6b298c0a47fc077f20ee3880c8341dd2.scope: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989346 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-conmon-2711c3c37797ea7250c003f547df3243a84df00ef2d8258cb2ba9c3404205c7b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-conmon-2711c3c37797ea7250c003f547df3243a84df00ef2d8258cb2ba9c3404205c7b.scope: no such file or directory Dec 02 08:06:11 crc kubenswrapper[4691]: W1202 08:06:11.989365 4691 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-2711c3c37797ea7250c003f547df3243a84df00ef2d8258cb2ba9c3404205c7b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7667b055_d667_4488_a733_8f8996295fcf.slice/crio-2711c3c37797ea7250c003f547df3243a84df00ef2d8258cb2ba9c3404205c7b.scope: no such file or directory Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.143805 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p8dbg"]
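
The watcher.go:93 warnings above are a benign race rather than a failure of the pods being started: the container-stats watcher walks /sys/fs/cgroup and calls inotify_add_watch on each newly seen cgroup directory, but CRI-O had already removed the crio-*.scope directories of the just-deleted ceilometer-0 pod (the pod7667b055_... slice), so the watch attempt returns "no such file or directory". A minimal sketch of tolerating that race, assuming golang.org/x/sys/unix on Linux (this is not cadvisor's actual code):

```go
// Illustrative only: adding an inotify watch on a cgroup directory that may
// disappear between directory listing and the watch call, as in the log above.
package main

import (
	"fmt"

	"golang.org/x/sys/unix"
)

func watchCgroupDir(fd int, dir string) error {
	_, err := unix.InotifyAddWatch(fd, dir, unix.IN_CREATE|unix.IN_ISDIR)
	if err == unix.ENOENT {
		// The directory vanished because the container was already torn down.
		// Treat it as a skippable event, which is why the kubelet only logs a
		// warning here instead of failing.
		fmt.Printf("skipping %s: %v\n", dir, err)
		return nil
	}
	return err
}

func main() {
	fd, err := unix.InotifyInit1(0)
	if err != nil {
		panic(err)
	}
	defer unix.Close(fd)
	_ = watchCgroupDir(fd, "/sys/fs/cgroup/kubepods.slice/already-removed.scope")
}
```
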
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.150378 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-k2mqh" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.150487 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.150742 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.183706 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p8dbg"] Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.407530 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-scripts\") pod \"nova-cell0-conductor-db-sync-p8dbg\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.408015 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-config-data\") pod \"nova-cell0-conductor-db-sync-p8dbg\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.408125 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p8dbg\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.408359 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndnrl\" (UniqueName: \"kubernetes.io/projected/ecc9085a-c030-4fd8-bb83-ad19b91315ba-kube-api-access-ndnrl\") pod \"nova-cell0-conductor-db-sync-p8dbg\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.437940 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23424898-747d-4eef-8f7e-ee64e1bf1070","Type":"ContainerStarted","Data":"a02b2851146d61aca22f0e1f42e51d4a987241fb5d3adacc94114f0248e5f824"} Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.440084 4691 generic.go:334] "Generic (PLEG): container finished" podID="a95b5239-be71-4b06-88b2-52875915162e" containerID="e8af3edd4341dbc165ca8c818abe8117ef90323a541e8dbd9510d558ed71bb9f" exitCode=137 Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.440135 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758b4cf594-fpkds" event={"ID":"a95b5239-be71-4b06-88b2-52875915162e","Type":"ContainerDied","Data":"e8af3edd4341dbc165ca8c818abe8117ef90323a541e8dbd9510d558ed71bb9f"} Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.440938 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4786bf63-8e3b-4048-8fde-998c8ce209a1","Type":"ContainerStarted","Data":"75a1bd64eddc2db68fe390a0014bb1e0be12a1133550a7bbb9ad0ebe27c0b263"} Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.521524 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-scripts\") pod \"nova-cell0-conductor-db-sync-p8dbg\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.521687 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-config-data\") pod \"nova-cell0-conductor-db-sync-p8dbg\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.521789 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p8dbg\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.521903 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndnrl\" (UniqueName: \"kubernetes.io/projected/ecc9085a-c030-4fd8-bb83-ad19b91315ba-kube-api-access-ndnrl\") pod \"nova-cell0-conductor-db-sync-p8dbg\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.537377 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-config-data\") pod \"nova-cell0-conductor-db-sync-p8dbg\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.550745 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-scripts\") pod \"nova-cell0-conductor-db-sync-p8dbg\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.551480 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-p8dbg\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.562467 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndnrl\" (UniqueName: \"kubernetes.io/projected/ecc9085a-c030-4fd8-bb83-ad19b91315ba-kube-api-access-ndnrl\") pod \"nova-cell0-conductor-db-sync-p8dbg\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: E1202 08:06:12.708906 4691 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f883f9e_2ece_4a76_85c8_46cca73e0796.slice/crio-e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f883f9e_2ece_4a76_85c8_46cca73e0796.slice/crio-conmon-e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5.scope\": RecentStats: unable to find data in memory cache]" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.795887 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.799150 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.945726 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a95b5239-be71-4b06-88b2-52875915162e-config-data\") pod \"a95b5239-be71-4b06-88b2-52875915162e\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.945819 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-horizon-tls-certs\") pod \"a95b5239-be71-4b06-88b2-52875915162e\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.945851 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a95b5239-be71-4b06-88b2-52875915162e-logs\") pod \"a95b5239-be71-4b06-88b2-52875915162e\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.945914 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg54q\" (UniqueName: \"kubernetes.io/projected/a95b5239-be71-4b06-88b2-52875915162e-kube-api-access-kg54q\") pod \"a95b5239-be71-4b06-88b2-52875915162e\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.946007 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a95b5239-be71-4b06-88b2-52875915162e-scripts\") pod \"a95b5239-be71-4b06-88b2-52875915162e\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.946031 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-combined-ca-bundle\") pod \"a95b5239-be71-4b06-88b2-52875915162e\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.946181 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-horizon-secret-key\") pod \"a95b5239-be71-4b06-88b2-52875915162e\" (UID: \"a95b5239-be71-4b06-88b2-52875915162e\") " Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.952404 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a95b5239-be71-4b06-88b2-52875915162e-logs" 
(OuterVolumeSpecName: "logs") pod "a95b5239-be71-4b06-88b2-52875915162e" (UID: "a95b5239-be71-4b06-88b2-52875915162e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.967106 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a95b5239-be71-4b06-88b2-52875915162e-kube-api-access-kg54q" (OuterVolumeSpecName: "kube-api-access-kg54q") pod "a95b5239-be71-4b06-88b2-52875915162e" (UID: "a95b5239-be71-4b06-88b2-52875915162e"). InnerVolumeSpecName "kube-api-access-kg54q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.968343 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a95b5239-be71-4b06-88b2-52875915162e" (UID: "a95b5239-be71-4b06-88b2-52875915162e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.981510 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95b5239-be71-4b06-88b2-52875915162e-config-data" (OuterVolumeSpecName: "config-data") pod "a95b5239-be71-4b06-88b2-52875915162e" (UID: "a95b5239-be71-4b06-88b2-52875915162e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:06:12 crc kubenswrapper[4691]: I1202 08:06:12.991128 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95b5239-be71-4b06-88b2-52875915162e-scripts" (OuterVolumeSpecName: "scripts") pod "a95b5239-be71-4b06-88b2-52875915162e" (UID: "a95b5239-be71-4b06-88b2-52875915162e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.004541 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a95b5239-be71-4b06-88b2-52875915162e" (UID: "a95b5239-be71-4b06-88b2-52875915162e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.045485 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "a95b5239-be71-4b06-88b2-52875915162e" (UID: "a95b5239-be71-4b06-88b2-52875915162e"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.048984 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a95b5239-be71-4b06-88b2-52875915162e-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.049033 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.049049 4691 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.049058 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a95b5239-be71-4b06-88b2-52875915162e-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.049070 4691 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a95b5239-be71-4b06-88b2-52875915162e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.049080 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a95b5239-be71-4b06-88b2-52875915162e-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.049088 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg54q\" (UniqueName: \"kubernetes.io/projected/a95b5239-be71-4b06-88b2-52875915162e-kube-api-access-kg54q\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.076109 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.251499 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f883f9e-2ece-4a76-85c8-46cca73e0796-httpd-run\") pod \"0f883f9e-2ece-4a76-85c8-46cca73e0796\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.251962 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-internal-tls-certs\") pod \"0f883f9e-2ece-4a76-85c8-46cca73e0796\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.251985 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-config-data\") pod \"0f883f9e-2ece-4a76-85c8-46cca73e0796\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.252081 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-combined-ca-bundle\") pod \"0f883f9e-2ece-4a76-85c8-46cca73e0796\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.252229 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z22j\" (UniqueName: \"kubernetes.io/projected/0f883f9e-2ece-4a76-85c8-46cca73e0796-kube-api-access-2z22j\") pod \"0f883f9e-2ece-4a76-85c8-46cca73e0796\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.252256 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f883f9e-2ece-4a76-85c8-46cca73e0796-logs\") pod \"0f883f9e-2ece-4a76-85c8-46cca73e0796\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.252276 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-scripts\") pod \"0f883f9e-2ece-4a76-85c8-46cca73e0796\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.252360 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"0f883f9e-2ece-4a76-85c8-46cca73e0796\" (UID: \"0f883f9e-2ece-4a76-85c8-46cca73e0796\") " Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.252463 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f883f9e-2ece-4a76-85c8-46cca73e0796-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0f883f9e-2ece-4a76-85c8-46cca73e0796" (UID: "0f883f9e-2ece-4a76-85c8-46cca73e0796"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.252745 4691 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0f883f9e-2ece-4a76-85c8-46cca73e0796-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.253947 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f883f9e-2ece-4a76-85c8-46cca73e0796-logs" (OuterVolumeSpecName: "logs") pod "0f883f9e-2ece-4a76-85c8-46cca73e0796" (UID: "0f883f9e-2ece-4a76-85c8-46cca73e0796"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.260041 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-scripts" (OuterVolumeSpecName: "scripts") pod "0f883f9e-2ece-4a76-85c8-46cca73e0796" (UID: "0f883f9e-2ece-4a76-85c8-46cca73e0796"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.263459 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f883f9e-2ece-4a76-85c8-46cca73e0796-kube-api-access-2z22j" (OuterVolumeSpecName: "kube-api-access-2z22j") pod "0f883f9e-2ece-4a76-85c8-46cca73e0796" (UID: "0f883f9e-2ece-4a76-85c8-46cca73e0796"). InnerVolumeSpecName "kube-api-access-2z22j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.276156 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "0f883f9e-2ece-4a76-85c8-46cca73e0796" (UID: "0f883f9e-2ece-4a76-85c8-46cca73e0796"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.310331 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f883f9e-2ece-4a76-85c8-46cca73e0796" (UID: "0f883f9e-2ece-4a76-85c8-46cca73e0796"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.355122 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.355177 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z22j\" (UniqueName: \"kubernetes.io/projected/0f883f9e-2ece-4a76-85c8-46cca73e0796-kube-api-access-2z22j\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.355194 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0f883f9e-2ece-4a76-85c8-46cca73e0796-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.355205 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.355233 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.358308 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-config-data" (OuterVolumeSpecName: "config-data") pod "0f883f9e-2ece-4a76-85c8-46cca73e0796" (UID: "0f883f9e-2ece-4a76-85c8-46cca73e0796"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.379376 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "0f883f9e-2ece-4a76-85c8-46cca73e0796" (UID: "0f883f9e-2ece-4a76-85c8-46cca73e0796"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.396615 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.474995 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.475050 4691 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0f883f9e-2ece-4a76-85c8-46cca73e0796-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.475065 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.479776 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-758b4cf594-fpkds" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.480151 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-758b4cf594-fpkds" event={"ID":"a95b5239-be71-4b06-88b2-52875915162e","Type":"ContainerDied","Data":"43b683a099f5e7a29c140d75614e74d4d36686f4d8c86ef07a7a20205f8acc30"} Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.480234 4691 scope.go:117] "RemoveContainer" containerID="fe343d6838838b4d379b0b348314566df9f6e06b542365de413698b6e144a093" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.484226 4691 generic.go:334] "Generic (PLEG): container finished" podID="0f883f9e-2ece-4a76-85c8-46cca73e0796" containerID="e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5" exitCode=0 Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.484285 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0f883f9e-2ece-4a76-85c8-46cca73e0796","Type":"ContainerDied","Data":"e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5"} Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.484304 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0f883f9e-2ece-4a76-85c8-46cca73e0796","Type":"ContainerDied","Data":"afae584a0dad11072297679c5ef29d923e1ff38328f7f242107907ce5bd2c291"} Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.484431 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.490386 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4786bf63-8e3b-4048-8fde-998c8ce209a1","Type":"ContainerStarted","Data":"ac14a8cebd6608e88c9c7bee3bc1cacb33a093455b9a380ad2e110885be88ccb"} Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.496683 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"23424898-747d-4eef-8f7e-ee64e1bf1070","Type":"ContainerStarted","Data":"dd7cfb348f7a010cc5b9429a4755da86eeb2c8e8a725908f4fc3e8f2d2f18ea7"} Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.529322 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p8dbg"] Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.549206 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-758b4cf594-fpkds"] Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.576559 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-758b4cf594-fpkds"] Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.593439 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.607283 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.633341 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:06:13 crc kubenswrapper[4691]: E1202 08:06:13.633971 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon-log" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.634004 4691 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon-log" Dec 02 08:06:13 crc kubenswrapper[4691]: E1202 08:06:13.634028 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f883f9e-2ece-4a76-85c8-46cca73e0796" containerName="glance-httpd" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.634041 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f883f9e-2ece-4a76-85c8-46cca73e0796" containerName="glance-httpd" Dec 02 08:06:13 crc kubenswrapper[4691]: E1202 08:06:13.634054 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.634063 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon" Dec 02 08:06:13 crc kubenswrapper[4691]: E1202 08:06:13.634078 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f883f9e-2ece-4a76-85c8-46cca73e0796" containerName="glance-log" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.634085 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f883f9e-2ece-4a76-85c8-46cca73e0796" containerName="glance-log" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.634320 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f883f9e-2ece-4a76-85c8-46cca73e0796" containerName="glance-log" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.634340 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f883f9e-2ece-4a76-85c8-46cca73e0796" containerName="glance-httpd" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.634367 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.634383 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a95b5239-be71-4b06-88b2-52875915162e" containerName="horizon-log" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.634434 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.634416336 podStartE2EDuration="4.634416336s" podCreationTimestamp="2025-12-02 08:06:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:13.576782419 +0000 UTC m=+1221.360861271" watchObservedRunningTime="2025-12-02 08:06:13.634416336 +0000 UTC m=+1221.418495198" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.635624 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.642304 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.642576 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.660561 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.717481 4691 scope.go:117] "RemoveContainer" containerID="e8af3edd4341dbc165ca8c818abe8117ef90323a541e8dbd9510d558ed71bb9f" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.780044 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0837fb6a-ad2a-4110-bec4-727f9daa999c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.780116 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0837fb6a-ad2a-4110-bec4-727f9daa999c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.780208 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0837fb6a-ad2a-4110-bec4-727f9daa999c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.780306 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.780336 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0837fb6a-ad2a-4110-bec4-727f9daa999c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.780401 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j72tp\" (UniqueName: \"kubernetes.io/projected/0837fb6a-ad2a-4110-bec4-727f9daa999c-kube-api-access-j72tp\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.780470 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0837fb6a-ad2a-4110-bec4-727f9daa999c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.780546 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0837fb6a-ad2a-4110-bec4-727f9daa999c-logs\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.786121 4691 scope.go:117] "RemoveContainer" containerID="e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.882452 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0837fb6a-ad2a-4110-bec4-727f9daa999c-logs\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.883005 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0837fb6a-ad2a-4110-bec4-727f9daa999c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.883053 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0837fb6a-ad2a-4110-bec4-727f9daa999c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.883090 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0837fb6a-ad2a-4110-bec4-727f9daa999c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.883141 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.883163 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0837fb6a-ad2a-4110-bec4-727f9daa999c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.883228 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j72tp\" (UniqueName: \"kubernetes.io/projected/0837fb6a-ad2a-4110-bec4-727f9daa999c-kube-api-access-j72tp\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.883295 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0837fb6a-ad2a-4110-bec4-727f9daa999c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.886320 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.886393 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0837fb6a-ad2a-4110-bec4-727f9daa999c-logs\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.890567 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0837fb6a-ad2a-4110-bec4-727f9daa999c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.893387 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0837fb6a-ad2a-4110-bec4-727f9daa999c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.893753 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0837fb6a-ad2a-4110-bec4-727f9daa999c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.897193 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0837fb6a-ad2a-4110-bec4-727f9daa999c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.899507 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0837fb6a-ad2a-4110-bec4-727f9daa999c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.910381 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j72tp\" (UniqueName: \"kubernetes.io/projected/0837fb6a-ad2a-4110-bec4-727f9daa999c-kube-api-access-j72tp\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.943751 4691 scope.go:117] "RemoveContainer" containerID="07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.949942 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"0837fb6a-ad2a-4110-bec4-727f9daa999c\") " pod="openstack/glance-default-internal-api-0" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.999005 4691 scope.go:117] "RemoveContainer" containerID="e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5" Dec 02 08:06:13 crc kubenswrapper[4691]: E1202 08:06:13.999587 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5\": container with ID starting with e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5 not found: ID does not exist" containerID="e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.999623 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5"} err="failed to get container status \"e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5\": rpc error: code = NotFound desc = could not find container \"e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5\": container with ID starting with e59e4e4687009e62bc721a0a63934001fda64c84d4978c856a16b80871ca29f5 not found: ID does not exist" Dec 02 08:06:13 crc kubenswrapper[4691]: I1202 08:06:13.999677 4691 scope.go:117] "RemoveContainer" containerID="07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08" Dec 02 08:06:14 crc kubenswrapper[4691]: E1202 08:06:14.000151 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08\": container with ID starting with 07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08 not found: ID does not exist" containerID="07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08" Dec 02 08:06:14 crc kubenswrapper[4691]: I1202 08:06:14.000174 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08"} err="failed to get container status \"07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08\": rpc error: code = NotFound desc = could not find container \"07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08\": container with ID starting with 07c8385715cb10f58e64030a4979bfb46679591e2e75a8ea23d6adaa94965b08 not found: ID does not exist" Dec 02 08:06:14 crc kubenswrapper[4691]: I1202 08:06:14.009857 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:14 crc kubenswrapper[4691]: I1202 08:06:14.514107 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p8dbg" event={"ID":"ecc9085a-c030-4fd8-bb83-ad19b91315ba","Type":"ContainerStarted","Data":"a17ee3b7eac82b787420c8050b31b98dbad409549571ddb6a9d2fef6d2b61a74"} Dec 02 08:06:14 crc kubenswrapper[4691]: I1202 08:06:14.667220 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f883f9e-2ece-4a76-85c8-46cca73e0796" path="/var/lib/kubelet/pods/0f883f9e-2ece-4a76-85c8-46cca73e0796/volumes" Dec 02 08:06:14 crc kubenswrapper[4691]: I1202 08:06:14.668948 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a95b5239-be71-4b06-88b2-52875915162e" path="/var/lib/kubelet/pods/a95b5239-be71-4b06-88b2-52875915162e/volumes" Dec 02 08:06:14 crc kubenswrapper[4691]: I1202 08:06:14.669734 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4786bf63-8e3b-4048-8fde-998c8ce209a1","Type":"ContainerStarted","Data":"92e64b428202ebd25163bfe45406441fea2324134562ff64e17067fa7e35cf56"} Dec 02 08:06:14 crc kubenswrapper[4691]: I1202 08:06:14.775289 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 02 08:06:15 crc kubenswrapper[4691]: I1202 08:06:15.623509 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0837fb6a-ad2a-4110-bec4-727f9daa999c","Type":"ContainerStarted","Data":"5b5f5cd29a67c064764cf21cfefda37f426c2f46093190da128ee3bc5752e10a"} Dec 02 08:06:15 crc kubenswrapper[4691]: I1202 08:06:15.623908 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0837fb6a-ad2a-4110-bec4-727f9daa999c","Type":"ContainerStarted","Data":"0f57ca93f7ddf5d2e5a271f352435f4047a9f7c0765ed73fbfc13d1a184bd4b5"} Dec 02 08:06:15 crc kubenswrapper[4691]: I1202 08:06:15.640320 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4786bf63-8e3b-4048-8fde-998c8ce209a1","Type":"ContainerStarted","Data":"9e0caeef111b9bf8610f7d44a4ef6b45f123ca5ef0e5c3e119c31f31a26cc7d3"} Dec 02 08:06:16 crc kubenswrapper[4691]: I1202 08:06:16.658843 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"0837fb6a-ad2a-4110-bec4-727f9daa999c","Type":"ContainerStarted","Data":"c6b7299c9086d8cd320d47554075a8770f155f0e06f688e71ab73ace712d3e41"} Dec 02 08:06:17 crc kubenswrapper[4691]: I1202 08:06:17.674597 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4786bf63-8e3b-4048-8fde-998c8ce209a1","Type":"ContainerStarted","Data":"cec882e6a038d7a08683ae117324c10a692114e1bcebaca65f34de4b5e402a50"} Dec 02 08:06:17 crc kubenswrapper[4691]: I1202 08:06:17.674697 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="ceilometer-central-agent" containerID="cri-o://ac14a8cebd6608e88c9c7bee3bc1cacb33a093455b9a380ad2e110885be88ccb" gracePeriod=30 Dec 02 08:06:17 crc kubenswrapper[4691]: I1202 08:06:17.674749 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="proxy-httpd" 
containerID="cri-o://cec882e6a038d7a08683ae117324c10a692114e1bcebaca65f34de4b5e402a50" gracePeriod=30 Dec 02 08:06:17 crc kubenswrapper[4691]: I1202 08:06:17.674776 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="sg-core" containerID="cri-o://9e0caeef111b9bf8610f7d44a4ef6b45f123ca5ef0e5c3e119c31f31a26cc7d3" gracePeriod=30 Dec 02 08:06:17 crc kubenswrapper[4691]: I1202 08:06:17.675149 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 08:06:17 crc kubenswrapper[4691]: I1202 08:06:17.674787 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="ceilometer-notification-agent" containerID="cri-o://92e64b428202ebd25163bfe45406441fea2324134562ff64e17067fa7e35cf56" gracePeriod=30 Dec 02 08:06:17 crc kubenswrapper[4691]: I1202 08:06:17.701511 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.087657994 podStartE2EDuration="7.701486665s" podCreationTimestamp="2025-12-02 08:06:10 +0000 UTC" firstStartedPulling="2025-12-02 08:06:11.320251489 +0000 UTC m=+1219.104330351" lastFinishedPulling="2025-12-02 08:06:16.93408016 +0000 UTC m=+1224.718159022" observedRunningTime="2025-12-02 08:06:17.697521658 +0000 UTC m=+1225.481600520" watchObservedRunningTime="2025-12-02 08:06:17.701486665 +0000 UTC m=+1225.485565517" Dec 02 08:06:17 crc kubenswrapper[4691]: I1202 08:06:17.709785 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.709739268 podStartE2EDuration="4.709739268s" podCreationTimestamp="2025-12-02 08:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:16.695257709 +0000 UTC m=+1224.479336591" watchObservedRunningTime="2025-12-02 08:06:17.709739268 +0000 UTC m=+1225.493818130" Dec 02 08:06:18 crc kubenswrapper[4691]: I1202 08:06:18.749239 4691 generic.go:334] "Generic (PLEG): container finished" podID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerID="cec882e6a038d7a08683ae117324c10a692114e1bcebaca65f34de4b5e402a50" exitCode=0 Dec 02 08:06:18 crc kubenswrapper[4691]: I1202 08:06:18.749697 4691 generic.go:334] "Generic (PLEG): container finished" podID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerID="9e0caeef111b9bf8610f7d44a4ef6b45f123ca5ef0e5c3e119c31f31a26cc7d3" exitCode=2 Dec 02 08:06:18 crc kubenswrapper[4691]: I1202 08:06:18.749713 4691 generic.go:334] "Generic (PLEG): container finished" podID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerID="92e64b428202ebd25163bfe45406441fea2324134562ff64e17067fa7e35cf56" exitCode=0 Dec 02 08:06:18 crc kubenswrapper[4691]: I1202 08:06:18.749724 4691 generic.go:334] "Generic (PLEG): container finished" podID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerID="ac14a8cebd6608e88c9c7bee3bc1cacb33a093455b9a380ad2e110885be88ccb" exitCode=0 Dec 02 08:06:18 crc kubenswrapper[4691]: I1202 08:06:18.749507 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4786bf63-8e3b-4048-8fde-998c8ce209a1","Type":"ContainerDied","Data":"cec882e6a038d7a08683ae117324c10a692114e1bcebaca65f34de4b5e402a50"} Dec 02 08:06:18 crc kubenswrapper[4691]: I1202 08:06:18.749860 4691 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4786bf63-8e3b-4048-8fde-998c8ce209a1","Type":"ContainerDied","Data":"9e0caeef111b9bf8610f7d44a4ef6b45f123ca5ef0e5c3e119c31f31a26cc7d3"} Dec 02 08:06:18 crc kubenswrapper[4691]: I1202 08:06:18.749896 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4786bf63-8e3b-4048-8fde-998c8ce209a1","Type":"ContainerDied","Data":"92e64b428202ebd25163bfe45406441fea2324134562ff64e17067fa7e35cf56"} Dec 02 08:06:18 crc kubenswrapper[4691]: I1202 08:06:18.749913 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4786bf63-8e3b-4048-8fde-998c8ce209a1","Type":"ContainerDied","Data":"ac14a8cebd6608e88c9c7bee3bc1cacb33a093455b9a380ad2e110885be88ccb"} Dec 02 08:06:19 crc kubenswrapper[4691]: I1202 08:06:19.926661 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 08:06:19 crc kubenswrapper[4691]: I1202 08:06:19.926746 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 02 08:06:19 crc kubenswrapper[4691]: I1202 08:06:19.961785 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 08:06:19 crc kubenswrapper[4691]: I1202 08:06:19.976515 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 02 08:06:20 crc kubenswrapper[4691]: I1202 08:06:20.771260 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 08:06:20 crc kubenswrapper[4691]: I1202 08:06:20.771340 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 02 08:06:21 crc kubenswrapper[4691]: I1202 08:06:21.898792 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:06:21 crc kubenswrapper[4691]: I1202 08:06:21.898884 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.130263 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.131067 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.324901 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.567846 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.677665 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4786bf63-8e3b-4048-8fde-998c8ce209a1-run-httpd\") pod \"4786bf63-8e3b-4048-8fde-998c8ce209a1\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.677824 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-sg-core-conf-yaml\") pod \"4786bf63-8e3b-4048-8fde-998c8ce209a1\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.677866 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n2qp\" (UniqueName: \"kubernetes.io/projected/4786bf63-8e3b-4048-8fde-998c8ce209a1-kube-api-access-2n2qp\") pod \"4786bf63-8e3b-4048-8fde-998c8ce209a1\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.677928 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4786bf63-8e3b-4048-8fde-998c8ce209a1-log-httpd\") pod \"4786bf63-8e3b-4048-8fde-998c8ce209a1\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.678026 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-combined-ca-bundle\") pod \"4786bf63-8e3b-4048-8fde-998c8ce209a1\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.678123 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-config-data\") pod \"4786bf63-8e3b-4048-8fde-998c8ce209a1\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.678206 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-scripts\") pod \"4786bf63-8e3b-4048-8fde-998c8ce209a1\" (UID: \"4786bf63-8e3b-4048-8fde-998c8ce209a1\") " Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.678330 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4786bf63-8e3b-4048-8fde-998c8ce209a1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4786bf63-8e3b-4048-8fde-998c8ce209a1" (UID: "4786bf63-8e3b-4048-8fde-998c8ce209a1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.678496 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4786bf63-8e3b-4048-8fde-998c8ce209a1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4786bf63-8e3b-4048-8fde-998c8ce209a1" (UID: "4786bf63-8e3b-4048-8fde-998c8ce209a1"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.678658 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4786bf63-8e3b-4048-8fde-998c8ce209a1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.678684 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4786bf63-8e3b-4048-8fde-998c8ce209a1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.688929 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-scripts" (OuterVolumeSpecName: "scripts") pod "4786bf63-8e3b-4048-8fde-998c8ce209a1" (UID: "4786bf63-8e3b-4048-8fde-998c8ce209a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.689044 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4786bf63-8e3b-4048-8fde-998c8ce209a1-kube-api-access-2n2qp" (OuterVolumeSpecName: "kube-api-access-2n2qp") pod "4786bf63-8e3b-4048-8fde-998c8ce209a1" (UID: "4786bf63-8e3b-4048-8fde-998c8ce209a1"). InnerVolumeSpecName "kube-api-access-2n2qp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.716526 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4786bf63-8e3b-4048-8fde-998c8ce209a1" (UID: "4786bf63-8e3b-4048-8fde-998c8ce209a1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.767978 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4786bf63-8e3b-4048-8fde-998c8ce209a1" (UID: "4786bf63-8e3b-4048-8fde-998c8ce209a1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.782213 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.782284 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.782303 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n2qp\" (UniqueName: \"kubernetes.io/projected/4786bf63-8e3b-4048-8fde-998c8ce209a1-kube-api-access-2n2qp\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.782319 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.809958 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-config-data" (OuterVolumeSpecName: "config-data") pod "4786bf63-8e3b-4048-8fde-998c8ce209a1" (UID: "4786bf63-8e3b-4048-8fde-998c8ce209a1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.821045 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4786bf63-8e3b-4048-8fde-998c8ce209a1","Type":"ContainerDied","Data":"75a1bd64eddc2db68fe390a0014bb1e0be12a1133550a7bbb9ad0ebe27c0b263"} Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.821098 4691 scope.go:117] "RemoveContainer" containerID="cec882e6a038d7a08683ae117324c10a692114e1bcebaca65f34de4b5e402a50" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.821237 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.843806 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p8dbg" event={"ID":"ecc9085a-c030-4fd8-bb83-ad19b91315ba","Type":"ContainerStarted","Data":"feb8f9c0f6ecbf6f086af84ef48419cd78a9a8cc4e31332a3143c405b1fc6a1b"} Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.879290 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.885749 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4786bf63-8e3b-4048-8fde-998c8ce209a1-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.955927 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.961170 4691 scope.go:117] "RemoveContainer" containerID="9e0caeef111b9bf8610f7d44a4ef6b45f123ca5ef0e5c3e119c31f31a26cc7d3" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.965299 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-p8dbg" podStartSLOduration=2.483989911 podStartE2EDuration="11.965240984s" podCreationTimestamp="2025-12-02 08:06:12 +0000 UTC" firstStartedPulling="2025-12-02 08:06:13.738331551 +0000 UTC m=+1221.522410423" lastFinishedPulling="2025-12-02 08:06:23.219582634 +0000 UTC m=+1231.003661496" observedRunningTime="2025-12-02 08:06:23.872235748 +0000 UTC m=+1231.656314610" watchObservedRunningTime="2025-12-02 08:06:23.965240984 +0000 UTC m=+1231.749319846" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.987877 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:06:23 crc kubenswrapper[4691]: E1202 08:06:23.989438 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="ceilometer-notification-agent" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.989543 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="ceilometer-notification-agent" Dec 02 08:06:23 crc kubenswrapper[4691]: E1202 08:06:23.989626 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="sg-core" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.989716 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="sg-core" Dec 02 08:06:23 crc kubenswrapper[4691]: E1202 08:06:23.989883 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="ceilometer-central-agent" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.989977 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="ceilometer-central-agent" Dec 02 08:06:23 crc kubenswrapper[4691]: E1202 08:06:23.990073 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="proxy-httpd" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.990156 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="proxy-httpd" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 
08:06:23.990482 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="sg-core" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.990573 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="ceilometer-notification-agent" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.990663 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="proxy-httpd" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.990747 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" containerName="ceilometer-central-agent" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.993446 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.997721 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 08:06:23 crc kubenswrapper[4691]: I1202 08:06:23.998028 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:23.998990 4691 scope.go:117] "RemoveContainer" containerID="92e64b428202ebd25163bfe45406441fea2324134562ff64e17067fa7e35cf56" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.003949 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.011073 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.011123 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.039677 4691 scope.go:117] "RemoveContainer" containerID="ac14a8cebd6608e88c9c7bee3bc1cacb33a093455b9a380ad2e110885be88ccb" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.066719 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.071223 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.089162 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.089273 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/133993ed-c8ce-4a04-8646-defeb53b9904-run-httpd\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.089320 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-scripts\") pod \"ceilometer-0\" (UID: 
\"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.089341 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/133993ed-c8ce-4a04-8646-defeb53b9904-log-httpd\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.089392 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.089427 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-config-data\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.089498 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wg5r\" (UniqueName: \"kubernetes.io/projected/133993ed-c8ce-4a04-8646-defeb53b9904-kube-api-access-6wg5r\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.192383 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wg5r\" (UniqueName: \"kubernetes.io/projected/133993ed-c8ce-4a04-8646-defeb53b9904-kube-api-access-6wg5r\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.192575 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.193880 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/133993ed-c8ce-4a04-8646-defeb53b9904-run-httpd\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.193953 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-scripts\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.193987 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/133993ed-c8ce-4a04-8646-defeb53b9904-log-httpd\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.194145 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.194205 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-config-data\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.197645 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/133993ed-c8ce-4a04-8646-defeb53b9904-log-httpd\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.197666 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/133993ed-c8ce-4a04-8646-defeb53b9904-run-httpd\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.201367 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-config-data\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.202793 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-scripts\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.204031 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.209022 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.210376 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wg5r\" (UniqueName: \"kubernetes.io/projected/133993ed-c8ce-4a04-8646-defeb53b9904-kube-api-access-6wg5r\") pod \"ceilometer-0\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.323143 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.596623 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4786bf63-8e3b-4048-8fde-998c8ce209a1" path="/var/lib/kubelet/pods/4786bf63-8e3b-4048-8fde-998c8ce209a1/volumes" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.853312 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.853351 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:24 crc kubenswrapper[4691]: W1202 08:06:24.890294 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod133993ed_c8ce_4a04_8646_defeb53b9904.slice/crio-658b46612fd4bf193ade29cb33202b629494555d579f3c08ce16307aafeeb499 WatchSource:0}: Error finding container 658b46612fd4bf193ade29cb33202b629494555d579f3c08ce16307aafeeb499: Status 404 returned error can't find the container with id 658b46612fd4bf193ade29cb33202b629494555d579f3c08ce16307aafeeb499 Dec 02 08:06:24 crc kubenswrapper[4691]: I1202 08:06:24.899282 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:06:25 crc kubenswrapper[4691]: I1202 08:06:25.873965 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"133993ed-c8ce-4a04-8646-defeb53b9904","Type":"ContainerStarted","Data":"aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6"} Dec 02 08:06:25 crc kubenswrapper[4691]: I1202 08:06:25.874598 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"133993ed-c8ce-4a04-8646-defeb53b9904","Type":"ContainerStarted","Data":"658b46612fd4bf193ade29cb33202b629494555d579f3c08ce16307aafeeb499"} Dec 02 08:06:26 crc kubenswrapper[4691]: I1202 08:06:26.884749 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"133993ed-c8ce-4a04-8646-defeb53b9904","Type":"ContainerStarted","Data":"09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623"} Dec 02 08:06:27 crc kubenswrapper[4691]: I1202 08:06:27.361721 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:27 crc kubenswrapper[4691]: I1202 08:06:27.362122 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 02 08:06:27 crc kubenswrapper[4691]: I1202 08:06:27.364108 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 02 08:06:27 crc kubenswrapper[4691]: I1202 08:06:27.897096 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"133993ed-c8ce-4a04-8646-defeb53b9904","Type":"ContainerStarted","Data":"07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80"} Dec 02 08:06:29 crc kubenswrapper[4691]: I1202 08:06:29.929734 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"133993ed-c8ce-4a04-8646-defeb53b9904","Type":"ContainerStarted","Data":"154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2"} Dec 02 08:06:29 crc kubenswrapper[4691]: I1202 08:06:29.932106 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 08:06:29 crc kubenswrapper[4691]: 
I1202 08:06:29.957236 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.137104141 podStartE2EDuration="6.957203426s" podCreationTimestamp="2025-12-02 08:06:23 +0000 UTC" firstStartedPulling="2025-12-02 08:06:24.893174335 +0000 UTC m=+1232.677253197" lastFinishedPulling="2025-12-02 08:06:28.71327362 +0000 UTC m=+1236.497352482" observedRunningTime="2025-12-02 08:06:29.950714204 +0000 UTC m=+1237.734793066" watchObservedRunningTime="2025-12-02 08:06:29.957203426 +0000 UTC m=+1237.741282288" Dec 02 08:06:35 crc kubenswrapper[4691]: I1202 08:06:35.984858 4691 generic.go:334] "Generic (PLEG): container finished" podID="ecc9085a-c030-4fd8-bb83-ad19b91315ba" containerID="feb8f9c0f6ecbf6f086af84ef48419cd78a9a8cc4e31332a3143c405b1fc6a1b" exitCode=0 Dec 02 08:06:35 crc kubenswrapper[4691]: I1202 08:06:35.984933 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p8dbg" event={"ID":"ecc9085a-c030-4fd8-bb83-ad19b91315ba","Type":"ContainerDied","Data":"feb8f9c0f6ecbf6f086af84ef48419cd78a9a8cc4e31332a3143c405b1fc6a1b"} Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.323736 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.490324 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndnrl\" (UniqueName: \"kubernetes.io/projected/ecc9085a-c030-4fd8-bb83-ad19b91315ba-kube-api-access-ndnrl\") pod \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.490848 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-config-data\") pod \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.490989 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-scripts\") pod \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.491077 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-combined-ca-bundle\") pod \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\" (UID: \"ecc9085a-c030-4fd8-bb83-ad19b91315ba\") " Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.497353 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc9085a-c030-4fd8-bb83-ad19b91315ba-kube-api-access-ndnrl" (OuterVolumeSpecName: "kube-api-access-ndnrl") pod "ecc9085a-c030-4fd8-bb83-ad19b91315ba" (UID: "ecc9085a-c030-4fd8-bb83-ad19b91315ba"). InnerVolumeSpecName "kube-api-access-ndnrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.516813 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-scripts" (OuterVolumeSpecName: "scripts") pod "ecc9085a-c030-4fd8-bb83-ad19b91315ba" (UID: "ecc9085a-c030-4fd8-bb83-ad19b91315ba"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.527453 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-config-data" (OuterVolumeSpecName: "config-data") pod "ecc9085a-c030-4fd8-bb83-ad19b91315ba" (UID: "ecc9085a-c030-4fd8-bb83-ad19b91315ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.553968 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ecc9085a-c030-4fd8-bb83-ad19b91315ba" (UID: "ecc9085a-c030-4fd8-bb83-ad19b91315ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.593402 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.593444 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.593461 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ecc9085a-c030-4fd8-bb83-ad19b91315ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:37 crc kubenswrapper[4691]: I1202 08:06:37.593478 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndnrl\" (UniqueName: \"kubernetes.io/projected/ecc9085a-c030-4fd8-bb83-ad19b91315ba-kube-api-access-ndnrl\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.005305 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-p8dbg" event={"ID":"ecc9085a-c030-4fd8-bb83-ad19b91315ba","Type":"ContainerDied","Data":"a17ee3b7eac82b787420c8050b31b98dbad409549571ddb6a9d2fef6d2b61a74"} Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.005356 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a17ee3b7eac82b787420c8050b31b98dbad409549571ddb6a9d2fef6d2b61a74" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.005381 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-p8dbg" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.111669 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 08:06:38 crc kubenswrapper[4691]: E1202 08:06:38.112371 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc9085a-c030-4fd8-bb83-ad19b91315ba" containerName="nova-cell0-conductor-db-sync" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.112398 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc9085a-c030-4fd8-bb83-ad19b91315ba" containerName="nova-cell0-conductor-db-sync" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.112638 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc9085a-c030-4fd8-bb83-ad19b91315ba" containerName="nova-cell0-conductor-db-sync" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.113514 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.118519 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-k2mqh" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.118745 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.128622 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.210545 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvvbk\" (UniqueName: \"kubernetes.io/projected/95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437-kube-api-access-qvvbk\") pod \"nova-cell0-conductor-0\" (UID: \"95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.210601 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.210982 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.312984 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.313125 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 
08:06:38.314091 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvvbk\" (UniqueName: \"kubernetes.io/projected/95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437-kube-api-access-qvvbk\") pod \"nova-cell0-conductor-0\" (UID: \"95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.330688 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.330874 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.335347 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvvbk\" (UniqueName: \"kubernetes.io/projected/95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437-kube-api-access-qvvbk\") pod \"nova-cell0-conductor-0\" (UID: \"95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437\") " pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.451945 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:38 crc kubenswrapper[4691]: I1202 08:06:38.906283 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 02 08:06:39 crc kubenswrapper[4691]: I1202 08:06:39.015094 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437","Type":"ContainerStarted","Data":"37cae4dfd7dc2d3f8e3268cf07fb7b3008468d9bf175e1e74310a7dfd648cd7e"} Dec 02 08:06:40 crc kubenswrapper[4691]: I1202 08:06:40.028087 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437","Type":"ContainerStarted","Data":"27896f2bf7c0dc3aacb5c6ef77ceaaa06258479c886dacae2285818ffa582d3f"} Dec 02 08:06:40 crc kubenswrapper[4691]: I1202 08:06:40.028431 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:40 crc kubenswrapper[4691]: I1202 08:06:40.049746 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.049721202 podStartE2EDuration="2.049721202s" podCreationTimestamp="2025-12-02 08:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:40.044062701 +0000 UTC m=+1247.828141573" watchObservedRunningTime="2025-12-02 08:06:40.049721202 +0000 UTC m=+1247.833800064" Dec 02 08:06:48 crc kubenswrapper[4691]: I1202 08:06:48.479280 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 02 08:06:48 crc kubenswrapper[4691]: I1202 08:06:48.935697 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-bdd2r"] Dec 02 08:06:48 crc kubenswrapper[4691]: I1202 08:06:48.937730 
4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:48 crc kubenswrapper[4691]: I1202 08:06:48.940569 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 02 08:06:48 crc kubenswrapper[4691]: I1202 08:06:48.941065 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 02 08:06:48 crc kubenswrapper[4691]: I1202 08:06:48.948753 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bdd2r"] Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.072812 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm5cj\" (UniqueName: \"kubernetes.io/projected/90e610cf-6ebe-4904-afe8-749d466fa6eb-kube-api-access-sm5cj\") pod \"nova-cell0-cell-mapping-bdd2r\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.072911 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-scripts\") pod \"nova-cell0-cell-mapping-bdd2r\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.072973 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bdd2r\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.073035 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-config-data\") pod \"nova-cell0-cell-mapping-bdd2r\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.177155 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm5cj\" (UniqueName: \"kubernetes.io/projected/90e610cf-6ebe-4904-afe8-749d466fa6eb-kube-api-access-sm5cj\") pod \"nova-cell0-cell-mapping-bdd2r\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.177422 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-scripts\") pod \"nova-cell0-cell-mapping-bdd2r\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.177544 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bdd2r\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.177651 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-config-data\") pod \"nova-cell0-cell-mapping-bdd2r\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.188685 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-config-data\") pod \"nova-cell0-cell-mapping-bdd2r\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.234816 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-bdd2r\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.237209 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-scripts\") pod \"nova-cell0-cell-mapping-bdd2r\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.326604 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm5cj\" (UniqueName: \"kubernetes.io/projected/90e610cf-6ebe-4904-afe8-749d466fa6eb-kube-api-access-sm5cj\") pod \"nova-cell0-cell-mapping-bdd2r\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.328881 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.330809 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.337666 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.353239 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.476098 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.498096 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.500215 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4kb7\" (UniqueName: \"kubernetes.io/projected/bee234f2-733f-4011-8cbc-baa61d2323c0-kube-api-access-z4kb7\") pod \"nova-api-0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.500282 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bee234f2-733f-4011-8cbc-baa61d2323c0-logs\") pod \"nova-api-0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.500317 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee234f2-733f-4011-8cbc-baa61d2323c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.500380 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bee234f2-733f-4011-8cbc-baa61d2323c0-config-data\") pod \"nova-api-0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.510079 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.522696 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.576983 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.610228 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bee234f2-733f-4011-8cbc-baa61d2323c0-config-data\") pod \"nova-api-0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.610330 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dae46655-0230-41ca-a576-10924967589b-logs\") pod \"nova-metadata-0\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.610364 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae46655-0230-41ca-a576-10924967589b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.610411 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bxt2\" (UniqueName: \"kubernetes.io/projected/dae46655-0230-41ca-a576-10924967589b-kube-api-access-8bxt2\") pod \"nova-metadata-0\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.610460 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dae46655-0230-41ca-a576-10924967589b-config-data\") pod \"nova-metadata-0\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.610484 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4kb7\" (UniqueName: \"kubernetes.io/projected/bee234f2-733f-4011-8cbc-baa61d2323c0-kube-api-access-z4kb7\") pod \"nova-api-0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.610510 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bee234f2-733f-4011-8cbc-baa61d2323c0-logs\") pod \"nova-api-0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.610530 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee234f2-733f-4011-8cbc-baa61d2323c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.612506 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bee234f2-733f-4011-8cbc-baa61d2323c0-logs\") pod \"nova-api-0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.627166 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bee234f2-733f-4011-8cbc-baa61d2323c0-config-data\") pod \"nova-api-0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.627572 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee234f2-733f-4011-8cbc-baa61d2323c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.645453 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4kb7\" (UniqueName: \"kubernetes.io/projected/bee234f2-733f-4011-8cbc-baa61d2323c0-kube-api-access-z4kb7\") pod \"nova-api-0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.655827 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.657205 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.669156 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.712207 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dae46655-0230-41ca-a576-10924967589b-logs\") pod \"nova-metadata-0\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.712274 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae46655-0230-41ca-a576-10924967589b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.712336 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bxt2\" (UniqueName: \"kubernetes.io/projected/dae46655-0230-41ca-a576-10924967589b-kube-api-access-8bxt2\") pod \"nova-metadata-0\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.712362 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0c01ba-70df-4e67-9582-2984e9fb292c-config-data\") pod \"nova-scheduler-0\" (UID: \"ac0c01ba-70df-4e67-9582-2984e9fb292c\") " pod="openstack/nova-scheduler-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.712423 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dae46655-0230-41ca-a576-10924967589b-config-data\") pod \"nova-metadata-0\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.712445 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jczt8\" (UniqueName: \"kubernetes.io/projected/ac0c01ba-70df-4e67-9582-2984e9fb292c-kube-api-access-jczt8\") pod \"nova-scheduler-0\" (UID: \"ac0c01ba-70df-4e67-9582-2984e9fb292c\") " 
pod="openstack/nova-scheduler-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.712495 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c01ba-70df-4e67-9582-2984e9fb292c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ac0c01ba-70df-4e67-9582-2984e9fb292c\") " pod="openstack/nova-scheduler-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.712986 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dae46655-0230-41ca-a576-10924967589b-logs\") pod \"nova-metadata-0\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.724960 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae46655-0230-41ca-a576-10924967589b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.737575 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dae46655-0230-41ca-a576-10924967589b-config-data\") pod \"nova-metadata-0\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.739558 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.754431 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bxt2\" (UniqueName: \"kubernetes.io/projected/dae46655-0230-41ca-a576-10924967589b-kube-api-access-8bxt2\") pod \"nova-metadata-0\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.754513 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.816615 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0c01ba-70df-4e67-9582-2984e9fb292c-config-data\") pod \"nova-scheduler-0\" (UID: \"ac0c01ba-70df-4e67-9582-2984e9fb292c\") " pod="openstack/nova-scheduler-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.816699 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jczt8\" (UniqueName: \"kubernetes.io/projected/ac0c01ba-70df-4e67-9582-2984e9fb292c-kube-api-access-jczt8\") pod \"nova-scheduler-0\" (UID: \"ac0c01ba-70df-4e67-9582-2984e9fb292c\") " pod="openstack/nova-scheduler-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.816747 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c01ba-70df-4e67-9582-2984e9fb292c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ac0c01ba-70df-4e67-9582-2984e9fb292c\") " pod="openstack/nova-scheduler-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.825143 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c01ba-70df-4e67-9582-2984e9fb292c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"ac0c01ba-70df-4e67-9582-2984e9fb292c\") " pod="openstack/nova-scheduler-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.827334 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0c01ba-70df-4e67-9582-2984e9fb292c-config-data\") pod \"nova-scheduler-0\" (UID: \"ac0c01ba-70df-4e67-9582-2984e9fb292c\") " pod="openstack/nova-scheduler-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.827403 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ckjgd"] Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.829244 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.853454 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jczt8\" (UniqueName: \"kubernetes.io/projected/ac0c01ba-70df-4e67-9582-2984e9fb292c-kube-api-access-jczt8\") pod \"nova-scheduler-0\" (UID: \"ac0c01ba-70df-4e67-9582-2984e9fb292c\") " pod="openstack/nova-scheduler-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.862014 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ckjgd"] Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.888289 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.918821 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.918987 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.919042 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-dns-svc\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.919158 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.919218 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-config\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.919317 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-ovsdbserver-nb\") pod 
\"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.919362 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lbkz\" (UniqueName: \"kubernetes.io/projected/5402d84d-ce2a-4a49-810e-600f6bfa5fef-kube-api-access-9lbkz\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.920573 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.923714 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 08:06:49 crc kubenswrapper[4691]: I1202 08:06:49.938748 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.022540 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.022610 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lbkz\" (UniqueName: \"kubernetes.io/projected/5402d84d-ce2a-4a49-810e-600f6bfa5fef-kube-api-access-9lbkz\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.022643 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.022670 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch5bd\" (UniqueName: \"kubernetes.io/projected/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-kube-api-access-ch5bd\") pod \"nova-cell1-novncproxy-0\" (UID: \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.022856 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.022895 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-dns-svc\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.022930 4691 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.022955 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.022996 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-config\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.023744 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.024081 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-config\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.024548 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-dns-svc\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.026892 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.031375 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.043942 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lbkz\" (UniqueName: \"kubernetes.io/projected/5402d84d-ce2a-4a49-810e-600f6bfa5fef-kube-api-access-9lbkz\") pod \"dnsmasq-dns-757b4f8459-ckjgd\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.079227 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.131483 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.131610 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch5bd\" (UniqueName: \"kubernetes.io/projected/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-kube-api-access-ch5bd\") pod \"nova-cell1-novncproxy-0\" (UID: \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.131972 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.138060 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.140501 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.154345 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch5bd\" (UniqueName: \"kubernetes.io/projected/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-kube-api-access-ch5bd\") pod \"nova-cell1-novncproxy-0\" (UID: \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.163279 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.254983 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.366909 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.380033 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-bdd2r"] Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.559720 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5bh6t"] Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.561943 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.596857 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.597045 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.633130 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5bh6t"] Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.641693 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:06:50 crc kubenswrapper[4691]: W1202 08:06:50.649616 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddae46655_0230_41ca_a576_10924967589b.slice/crio-4328854e093b97f869b5596a1f0f588f59b190223dd517c779ce46e0858d3c30 WatchSource:0}: Error finding container 4328854e093b97f869b5596a1f0f588f59b190223dd517c779ce46e0858d3c30: Status 404 returned error can't find the container with id 4328854e093b97f869b5596a1f0f588f59b190223dd517c779ce46e0858d3c30 Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.653347 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-scripts\") pod \"nova-cell1-conductor-db-sync-5bh6t\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.653425 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-config-data\") pod \"nova-cell1-conductor-db-sync-5bh6t\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.653447 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rl2r\" (UniqueName: \"kubernetes.io/projected/3963ab52-3d58-4201-baa1-6743421bdca3-kube-api-access-5rl2r\") pod \"nova-cell1-conductor-db-sync-5bh6t\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.653488 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5bh6t\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.755967 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-scripts\") pod \"nova-cell1-conductor-db-sync-5bh6t\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.756376 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-config-data\") pod \"nova-cell1-conductor-db-sync-5bh6t\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.756397 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rl2r\" (UniqueName: \"kubernetes.io/projected/3963ab52-3d58-4201-baa1-6743421bdca3-kube-api-access-5rl2r\") pod \"nova-cell1-conductor-db-sync-5bh6t\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.756453 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5bh6t\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.762647 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-scripts\") pod \"nova-cell1-conductor-db-sync-5bh6t\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.763511 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-5bh6t\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.767644 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-config-data\") pod \"nova-cell1-conductor-db-sync-5bh6t\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.782026 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rl2r\" (UniqueName: \"kubernetes.io/projected/3963ab52-3d58-4201-baa1-6743421bdca3-kube-api-access-5rl2r\") pod \"nova-cell1-conductor-db-sync-5bh6t\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.907118 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:06:50 crc kubenswrapper[4691]: W1202 08:06:50.910534 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac0c01ba_70df_4e67_9582_2984e9fb292c.slice/crio-1c1264cdd2082088dceee9577f3b9e1658642181e6c4ec9a3af203b3ee827fbf WatchSource:0}: Error finding container 1c1264cdd2082088dceee9577f3b9e1658642181e6c4ec9a3af203b3ee827fbf: Status 404 returned error can't find the container with id 1c1264cdd2082088dceee9577f3b9e1658642181e6c4ec9a3af203b3ee827fbf Dec 02 08:06:50 crc kubenswrapper[4691]: I1202 08:06:50.938015 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.069807 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ckjgd"] Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.081447 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.288530 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"86f7acc1-f8b0-4333-8f26-4f003a37e4d9","Type":"ContainerStarted","Data":"d947862d79bba2a4ac766e7238f991eb446aee970bbe00fb21dbc46f6a01ae3d"} Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.302550 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ac0c01ba-70df-4e67-9582-2984e9fb292c","Type":"ContainerStarted","Data":"1c1264cdd2082088dceee9577f3b9e1658642181e6c4ec9a3af203b3ee827fbf"} Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.308834 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bee234f2-733f-4011-8cbc-baa61d2323c0","Type":"ContainerStarted","Data":"fa8f32604933cb47f555bb027347b2ffde50333f29421ae18d4b5d3a354ffad8"} Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.310611 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" event={"ID":"5402d84d-ce2a-4a49-810e-600f6bfa5fef","Type":"ContainerStarted","Data":"14a01ef08bdcc213776ad509536df0d4abc6e656f0620353e8cac27a53e7afc9"} Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.313057 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bdd2r" event={"ID":"90e610cf-6ebe-4904-afe8-749d466fa6eb","Type":"ContainerStarted","Data":"c8ee537f19bcb052d12c87824157b6652a90e3b8b567d5e70a320cd6f372eb98"} Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.313079 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bdd2r" event={"ID":"90e610cf-6ebe-4904-afe8-749d466fa6eb","Type":"ContainerStarted","Data":"d2db50ccf598479061881b5b7daef041686aad2214725cf3b1e77fc276ff022d"} Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.326666 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dae46655-0230-41ca-a576-10924967589b","Type":"ContainerStarted","Data":"4328854e093b97f869b5596a1f0f588f59b190223dd517c779ce46e0858d3c30"} Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.334150 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-bdd2r" podStartSLOduration=3.334132375 podStartE2EDuration="3.334132375s" podCreationTimestamp="2025-12-02 08:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:51.331805427 +0000 UTC m=+1259.115884289" watchObservedRunningTime="2025-12-02 08:06:51.334132375 +0000 UTC m=+1259.118211237" Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.482809 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5bh6t"] Dec 02 08:06:51 crc kubenswrapper[4691]: W1202 08:06:51.489694 4691 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3963ab52_3d58_4201_baa1_6743421bdca3.slice/crio-f69d2fdb75712e7b8f05f4e0f3622f604ad6fad1e8be83acc5d668b2f1c92ef5 WatchSource:0}: Error finding container f69d2fdb75712e7b8f05f4e0f3622f604ad6fad1e8be83acc5d668b2f1c92ef5: Status 404 returned error can't find the container with id f69d2fdb75712e7b8f05f4e0f3622f604ad6fad1e8be83acc5d668b2f1c92ef5 Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.908096 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.908438 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.908493 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.909446 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3c6622064f9b8ca4e8932f776c16ae5af9973dd396f94dcf631a7aa1f00aa037"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:06:51 crc kubenswrapper[4691]: I1202 08:06:51.909501 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://3c6622064f9b8ca4e8932f776c16ae5af9973dd396f94dcf631a7aa1f00aa037" gracePeriod=600 Dec 02 08:06:52 crc kubenswrapper[4691]: I1202 08:06:52.408443 4691 generic.go:334] "Generic (PLEG): container finished" podID="5402d84d-ce2a-4a49-810e-600f6bfa5fef" containerID="f3535ec4678743b3963ae2dbe40c65668525394f80ae452be7078d0397d5da0e" exitCode=0 Dec 02 08:06:52 crc kubenswrapper[4691]: I1202 08:06:52.408829 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" event={"ID":"5402d84d-ce2a-4a49-810e-600f6bfa5fef","Type":"ContainerDied","Data":"f3535ec4678743b3963ae2dbe40c65668525394f80ae452be7078d0397d5da0e"} Dec 02 08:06:52 crc kubenswrapper[4691]: I1202 08:06:52.442163 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="3c6622064f9b8ca4e8932f776c16ae5af9973dd396f94dcf631a7aa1f00aa037" exitCode=0 Dec 02 08:06:52 crc kubenswrapper[4691]: I1202 08:06:52.442256 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"3c6622064f9b8ca4e8932f776c16ae5af9973dd396f94dcf631a7aa1f00aa037"} Dec 02 08:06:52 crc kubenswrapper[4691]: I1202 08:06:52.442293 4691 scope.go:117] "RemoveContainer" containerID="29b37d8d63090a5b29435fd6f341a26e6433431bf7160b686913291b1dd9efc2" Dec 02 08:06:52 crc 
kubenswrapper[4691]: I1202 08:06:52.460233 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5bh6t" event={"ID":"3963ab52-3d58-4201-baa1-6743421bdca3","Type":"ContainerStarted","Data":"b43372f37196cc7c10c33b8f309e2ac4d0c3cbb7f0fee43448aecd1ce4547a0d"} Dec 02 08:06:52 crc kubenswrapper[4691]: I1202 08:06:52.460277 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5bh6t" event={"ID":"3963ab52-3d58-4201-baa1-6743421bdca3","Type":"ContainerStarted","Data":"f69d2fdb75712e7b8f05f4e0f3622f604ad6fad1e8be83acc5d668b2f1c92ef5"} Dec 02 08:06:52 crc kubenswrapper[4691]: I1202 08:06:52.530517 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-5bh6t" podStartSLOduration=2.530494012 podStartE2EDuration="2.530494012s" podCreationTimestamp="2025-12-02 08:06:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:52.508516454 +0000 UTC m=+1260.292595316" watchObservedRunningTime="2025-12-02 08:06:52.530494012 +0000 UTC m=+1260.314572874" Dec 02 08:06:53 crc kubenswrapper[4691]: I1202 08:06:53.476412 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:06:53 crc kubenswrapper[4691]: I1202 08:06:53.512745 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:06:54 crc kubenswrapper[4691]: I1202 08:06:54.330152 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 08:06:55 crc kubenswrapper[4691]: I1202 08:06:55.510298 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"35bf2b176e04ba95989431e7a1c5a8ad045d68a6d864e710ee0e03b73b56f536"} Dec 02 08:06:55 crc kubenswrapper[4691]: I1202 08:06:55.515668 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"86f7acc1-f8b0-4333-8f26-4f003a37e4d9","Type":"ContainerStarted","Data":"f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100"} Dec 02 08:06:55 crc kubenswrapper[4691]: I1202 08:06:55.518485 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" event={"ID":"5402d84d-ce2a-4a49-810e-600f6bfa5fef","Type":"ContainerStarted","Data":"e3ffb8fee44d8f92819e5d3d4c4aa81a51986252f008b7e59d26cdcdf348fa0c"} Dec 02 08:06:55 crc kubenswrapper[4691]: I1202 08:06:55.519716 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bee234f2-733f-4011-8cbc-baa61d2323c0","Type":"ContainerStarted","Data":"387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601"} Dec 02 08:06:56 crc kubenswrapper[4691]: I1202 08:06:56.530293 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ac0c01ba-70df-4e67-9582-2984e9fb292c","Type":"ContainerStarted","Data":"bea1158005c67841b08e91550497b841dd032e2ce897feb145e95426d02dbe30"} Dec 02 08:06:56 crc kubenswrapper[4691]: I1202 08:06:56.532469 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bee234f2-733f-4011-8cbc-baa61d2323c0","Type":"ContainerStarted","Data":"728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f"} Dec 02 08:06:56 crc 
kubenswrapper[4691]: I1202 08:06:56.535086 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dae46655-0230-41ca-a576-10924967589b" containerName="nova-metadata-log" containerID="cri-o://4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487" gracePeriod=30 Dec 02 08:06:56 crc kubenswrapper[4691]: I1202 08:06:56.535396 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dae46655-0230-41ca-a576-10924967589b","Type":"ContainerStarted","Data":"cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649"} Dec 02 08:06:56 crc kubenswrapper[4691]: I1202 08:06:56.535429 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dae46655-0230-41ca-a576-10924967589b","Type":"ContainerStarted","Data":"4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487"} Dec 02 08:06:56 crc kubenswrapper[4691]: I1202 08:06:56.535448 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:06:56 crc kubenswrapper[4691]: I1202 08:06:56.539155 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="dae46655-0230-41ca-a576-10924967589b" containerName="nova-metadata-metadata" containerID="cri-o://cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649" gracePeriod=30 Dec 02 08:06:56 crc kubenswrapper[4691]: I1202 08:06:56.539313 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="86f7acc1-f8b0-4333-8f26-4f003a37e4d9" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100" gracePeriod=30 Dec 02 08:06:56 crc kubenswrapper[4691]: I1202 08:06:56.582839 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.8576352 podStartE2EDuration="7.582818211s" podCreationTimestamp="2025-12-02 08:06:49 +0000 UTC" firstStartedPulling="2025-12-02 08:06:50.913662108 +0000 UTC m=+1258.697740970" lastFinishedPulling="2025-12-02 08:06:54.638845119 +0000 UTC m=+1262.422923981" observedRunningTime="2025-12-02 08:06:56.56514614 +0000 UTC m=+1264.349225002" watchObservedRunningTime="2025-12-02 08:06:56.582818211 +0000 UTC m=+1264.366897073" Dec 02 08:06:56 crc kubenswrapper[4691]: I1202 08:06:56.625702 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.061242822 podStartE2EDuration="7.625681231s" podCreationTimestamp="2025-12-02 08:06:49 +0000 UTC" firstStartedPulling="2025-12-02 08:06:51.131059185 +0000 UTC m=+1258.915138047" lastFinishedPulling="2025-12-02 08:06:54.695497594 +0000 UTC m=+1262.479576456" observedRunningTime="2025-12-02 08:06:56.581347445 +0000 UTC m=+1264.365426317" watchObservedRunningTime="2025-12-02 08:06:56.625681231 +0000 UTC m=+1264.409760093" Dec 02 08:06:56 crc kubenswrapper[4691]: I1202 08:06:56.633211 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.635674008 podStartE2EDuration="7.633069346s" podCreationTimestamp="2025-12-02 08:06:49 +0000 UTC" firstStartedPulling="2025-12-02 08:06:50.656258421 +0000 UTC m=+1258.440337283" lastFinishedPulling="2025-12-02 08:06:54.653653759 +0000 UTC m=+1262.437732621" 
observedRunningTime="2025-12-02 08:06:56.602221426 +0000 UTC m=+1264.386300318" watchObservedRunningTime="2025-12-02 08:06:56.633069346 +0000 UTC m=+1264.417148208" Dec 02 08:06:56 crc kubenswrapper[4691]: I1202 08:06:56.697285 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.530004529 podStartE2EDuration="7.697223527s" podCreationTimestamp="2025-12-02 08:06:49 +0000 UTC" firstStartedPulling="2025-12-02 08:06:50.450411852 +0000 UTC m=+1258.234490714" lastFinishedPulling="2025-12-02 08:06:54.61763084 +0000 UTC m=+1262.401709712" observedRunningTime="2025-12-02 08:06:56.647282971 +0000 UTC m=+1264.431361823" watchObservedRunningTime="2025-12-02 08:06:56.697223527 +0000 UTC m=+1264.481302389" Dec 02 08:06:56 crc kubenswrapper[4691]: I1202 08:06:56.699598 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" podStartSLOduration=7.699583626 podStartE2EDuration="7.699583626s" podCreationTimestamp="2025-12-02 08:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:56.675493155 +0000 UTC m=+1264.459572037" watchObservedRunningTime="2025-12-02 08:06:56.699583626 +0000 UTC m=+1264.483662498" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.227881 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.334785 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dae46655-0230-41ca-a576-10924967589b-config-data\") pod \"dae46655-0230-41ca-a576-10924967589b\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.334975 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae46655-0230-41ca-a576-10924967589b-combined-ca-bundle\") pod \"dae46655-0230-41ca-a576-10924967589b\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.335008 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bxt2\" (UniqueName: \"kubernetes.io/projected/dae46655-0230-41ca-a576-10924967589b-kube-api-access-8bxt2\") pod \"dae46655-0230-41ca-a576-10924967589b\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.335073 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dae46655-0230-41ca-a576-10924967589b-logs\") pod \"dae46655-0230-41ca-a576-10924967589b\" (UID: \"dae46655-0230-41ca-a576-10924967589b\") " Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.336086 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dae46655-0230-41ca-a576-10924967589b-logs" (OuterVolumeSpecName: "logs") pod "dae46655-0230-41ca-a576-10924967589b" (UID: "dae46655-0230-41ca-a576-10924967589b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.356325 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae46655-0230-41ca-a576-10924967589b-kube-api-access-8bxt2" (OuterVolumeSpecName: "kube-api-access-8bxt2") pod "dae46655-0230-41ca-a576-10924967589b" (UID: "dae46655-0230-41ca-a576-10924967589b"). InnerVolumeSpecName "kube-api-access-8bxt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.381454 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae46655-0230-41ca-a576-10924967589b-config-data" (OuterVolumeSpecName: "config-data") pod "dae46655-0230-41ca-a576-10924967589b" (UID: "dae46655-0230-41ca-a576-10924967589b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.393027 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae46655-0230-41ca-a576-10924967589b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dae46655-0230-41ca-a576-10924967589b" (UID: "dae46655-0230-41ca-a576-10924967589b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.437913 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dae46655-0230-41ca-a576-10924967589b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.438260 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bxt2\" (UniqueName: \"kubernetes.io/projected/dae46655-0230-41ca-a576-10924967589b-kube-api-access-8bxt2\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.438355 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dae46655-0230-41ca-a576-10924967589b-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.438426 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dae46655-0230-41ca-a576-10924967589b-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.552073 4691 generic.go:334] "Generic (PLEG): container finished" podID="dae46655-0230-41ca-a576-10924967589b" containerID="cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649" exitCode=0 Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.552107 4691 generic.go:334] "Generic (PLEG): container finished" podID="dae46655-0230-41ca-a576-10924967589b" containerID="4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487" exitCode=143 Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.552145 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.552231 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dae46655-0230-41ca-a576-10924967589b","Type":"ContainerDied","Data":"cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649"} Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.552324 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dae46655-0230-41ca-a576-10924967589b","Type":"ContainerDied","Data":"4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487"} Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.552343 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"dae46655-0230-41ca-a576-10924967589b","Type":"ContainerDied","Data":"4328854e093b97f869b5596a1f0f588f59b190223dd517c779ce46e0858d3c30"} Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.552364 4691 scope.go:117] "RemoveContainer" containerID="cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.593967 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.596735 4691 scope.go:117] "RemoveContainer" containerID="4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.615084 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.625240 4691 scope.go:117] "RemoveContainer" containerID="cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649" Dec 02 08:06:57 crc kubenswrapper[4691]: E1202 08:06:57.626181 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649\": container with ID starting with cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649 not found: ID does not exist" containerID="cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.626234 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649"} err="failed to get container status \"cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649\": rpc error: code = NotFound desc = could not find container \"cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649\": container with ID starting with cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649 not found: ID does not exist" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.626263 4691 scope.go:117] "RemoveContainer" containerID="4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487" Dec 02 08:06:57 crc kubenswrapper[4691]: E1202 08:06:57.626613 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487\": container with ID starting with 4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487 not found: ID does not exist" containerID="4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 
08:06:57.626650 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487"} err="failed to get container status \"4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487\": rpc error: code = NotFound desc = could not find container \"4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487\": container with ID starting with 4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487 not found: ID does not exist" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.626667 4691 scope.go:117] "RemoveContainer" containerID="cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.626941 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649"} err="failed to get container status \"cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649\": rpc error: code = NotFound desc = could not find container \"cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649\": container with ID starting with cb3c97dd7376279f9ff45b799aac751cd2d537ce3382b3cf524afaff4cbeb649 not found: ID does not exist" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.626977 4691 scope.go:117] "RemoveContainer" containerID="4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.628147 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487"} err="failed to get container status \"4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487\": rpc error: code = NotFound desc = could not find container \"4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487\": container with ID starting with 4d066201e7a98bb7e4d3e53dd089de70165e75421e553b1d002f0ee802309487 not found: ID does not exist" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.628358 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:06:57 crc kubenswrapper[4691]: E1202 08:06:57.628816 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae46655-0230-41ca-a576-10924967589b" containerName="nova-metadata-metadata" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.628835 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae46655-0230-41ca-a576-10924967589b" containerName="nova-metadata-metadata" Dec 02 08:06:57 crc kubenswrapper[4691]: E1202 08:06:57.628857 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dae46655-0230-41ca-a576-10924967589b" containerName="nova-metadata-log" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.628866 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae46655-0230-41ca-a576-10924967589b" containerName="nova-metadata-log" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.629064 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae46655-0230-41ca-a576-10924967589b" containerName="nova-metadata-log" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.629086 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="dae46655-0230-41ca-a576-10924967589b" containerName="nova-metadata-metadata" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.630257 4691 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.634307 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.634583 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.637352 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.762398 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7kpl\" (UniqueName: \"kubernetes.io/projected/e5b8f9b9-f3a9-410c-9969-595abdb3932a-kube-api-access-x7kpl\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.762755 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-config-data\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.762815 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5b8f9b9-f3a9-410c-9969-595abdb3932a-logs\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.762834 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.762856 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.864919 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7kpl\" (UniqueName: \"kubernetes.io/projected/e5b8f9b9-f3a9-410c-9969-595abdb3932a-kube-api-access-x7kpl\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.865066 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-config-data\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.865101 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5b8f9b9-f3a9-410c-9969-595abdb3932a-logs\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " 
pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.865123 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.865158 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.866001 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5b8f9b9-f3a9-410c-9969-595abdb3932a-logs\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.948691 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.952131 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-config-data\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.952376 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7kpl\" (UniqueName: \"kubernetes.io/projected/e5b8f9b9-f3a9-410c-9969-595abdb3932a-kube-api-access-x7kpl\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.953304 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " pod="openstack/nova-metadata-0" Dec 02 08:06:57 crc kubenswrapper[4691]: I1202 08:06:57.968232 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:06:58 crc kubenswrapper[4691]: I1202 08:06:58.472853 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:06:58 crc kubenswrapper[4691]: W1202 08:06:58.478378 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5b8f9b9_f3a9_410c_9969_595abdb3932a.slice/crio-d67d15b7146aed307f206336d90601c9d1ca76bbd695455ef60a97c10facf84d WatchSource:0}: Error finding container d67d15b7146aed307f206336d90601c9d1ca76bbd695455ef60a97c10facf84d: Status 404 returned error can't find the container with id d67d15b7146aed307f206336d90601c9d1ca76bbd695455ef60a97c10facf84d Dec 02 08:06:58 crc kubenswrapper[4691]: I1202 08:06:58.574929 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae46655-0230-41ca-a576-10924967589b" path="/var/lib/kubelet/pods/dae46655-0230-41ca-a576-10924967589b/volumes" Dec 02 08:06:58 crc kubenswrapper[4691]: I1202 08:06:58.580978 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5b8f9b9-f3a9-410c-9969-595abdb3932a","Type":"ContainerStarted","Data":"d67d15b7146aed307f206336d90601c9d1ca76bbd695455ef60a97c10facf84d"} Dec 02 08:06:59 crc kubenswrapper[4691]: I1202 08:06:59.159864 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 08:06:59 crc kubenswrapper[4691]: I1202 08:06:59.160103 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="1473d859-fd02-490b-a906-bf8136cb422c" containerName="kube-state-metrics" containerID="cri-o://c1185d2699dd44b7387024287e734dcd624c4b4e4985e16c8bdafa6f889983b9" gracePeriod=30 Dec 02 08:06:59 crc kubenswrapper[4691]: I1202 08:06:59.609063 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5b8f9b9-f3a9-410c-9969-595abdb3932a","Type":"ContainerStarted","Data":"5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d"} Dec 02 08:06:59 crc kubenswrapper[4691]: I1202 08:06:59.609426 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5b8f9b9-f3a9-410c-9969-595abdb3932a","Type":"ContainerStarted","Data":"65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355"} Dec 02 08:06:59 crc kubenswrapper[4691]: I1202 08:06:59.613081 4691 generic.go:334] "Generic (PLEG): container finished" podID="1473d859-fd02-490b-a906-bf8136cb422c" containerID="c1185d2699dd44b7387024287e734dcd624c4b4e4985e16c8bdafa6f889983b9" exitCode=2 Dec 02 08:06:59 crc kubenswrapper[4691]: I1202 08:06:59.613141 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1473d859-fd02-490b-a906-bf8136cb422c","Type":"ContainerDied","Data":"c1185d2699dd44b7387024287e734dcd624c4b4e4985e16c8bdafa6f889983b9"} Dec 02 08:06:59 crc kubenswrapper[4691]: I1202 08:06:59.637317 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.637292398 podStartE2EDuration="2.637292398s" podCreationTimestamp="2025-12-02 08:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:06:59.62934924 +0000 UTC m=+1267.413428112" watchObservedRunningTime="2025-12-02 08:06:59.637292398 +0000 UTC m=+1267.421371270" Dec 02 08:06:59 crc 
kubenswrapper[4691]: I1202 08:06:59.741583 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:06:59 crc kubenswrapper[4691]: I1202 08:06:59.741626 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:06:59 crc kubenswrapper[4691]: I1202 08:06:59.743436 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 08:06:59 crc kubenswrapper[4691]: I1202 08:06:59.881521 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdsst\" (UniqueName: \"kubernetes.io/projected/1473d859-fd02-490b-a906-bf8136cb422c-kube-api-access-hdsst\") pod \"1473d859-fd02-490b-a906-bf8136cb422c\" (UID: \"1473d859-fd02-490b-a906-bf8136cb422c\") " Dec 02 08:06:59 crc kubenswrapper[4691]: I1202 08:06:59.887813 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1473d859-fd02-490b-a906-bf8136cb422c-kube-api-access-hdsst" (OuterVolumeSpecName: "kube-api-access-hdsst") pod "1473d859-fd02-490b-a906-bf8136cb422c" (UID: "1473d859-fd02-490b-a906-bf8136cb422c"). InnerVolumeSpecName "kube-api-access-hdsst". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:06:59 crc kubenswrapper[4691]: I1202 08:06:59.985001 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdsst\" (UniqueName: \"kubernetes.io/projected/1473d859-fd02-490b-a906-bf8136cb422c-kube-api-access-hdsst\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.080523 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.080583 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.116329 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.164778 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.239524 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-v65fc"] Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.239849 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" podUID="a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" containerName="dnsmasq-dns" containerID="cri-o://7041b9f35263279ee82e63a63843ff21479f24fc00cc9f5399f091c0bc88c62c" gracePeriod=10 Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.255813 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.628606 4691 generic.go:334] "Generic (PLEG): container finished" podID="90e610cf-6ebe-4904-afe8-749d466fa6eb" containerID="c8ee537f19bcb052d12c87824157b6652a90e3b8b567d5e70a320cd6f372eb98" exitCode=0 Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.628891 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bdd2r" 
event={"ID":"90e610cf-6ebe-4904-afe8-749d466fa6eb","Type":"ContainerDied","Data":"c8ee537f19bcb052d12c87824157b6652a90e3b8b567d5e70a320cd6f372eb98"} Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.644798 4691 generic.go:334] "Generic (PLEG): container finished" podID="a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" containerID="7041b9f35263279ee82e63a63843ff21479f24fc00cc9f5399f091c0bc88c62c" exitCode=0 Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.644896 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" event={"ID":"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3","Type":"ContainerDied","Data":"7041b9f35263279ee82e63a63843ff21479f24fc00cc9f5399f091c0bc88c62c"} Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.654775 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"1473d859-fd02-490b-a906-bf8136cb422c","Type":"ContainerDied","Data":"3b7b0d2b4a5af31b3ddc277a56220386a0018ba3fde17bcb9b6b2c124c96240d"} Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.654858 4691 scope.go:117] "RemoveContainer" containerID="c1185d2699dd44b7387024287e734dcd624c4b4e4985e16c8bdafa6f889983b9" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.655292 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.731356 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.755835 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.770928 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.794846 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 08:07:00 crc kubenswrapper[4691]: E1202 08:07:00.795485 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1473d859-fd02-490b-a906-bf8136cb422c" containerName="kube-state-metrics" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.795510 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1473d859-fd02-490b-a906-bf8136cb422c" containerName="kube-state-metrics" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.795811 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1473d859-fd02-490b-a906-bf8136cb422c" containerName="kube-state-metrics" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.796779 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.799574 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.799861 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.809438 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.823941 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bee234f2-733f-4011-8cbc-baa61d2323c0" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.824233 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="bee234f2-733f-4011-8cbc-baa61d2323c0" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.183:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.897387 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.916478 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbbsh\" (UniqueName: \"kubernetes.io/projected/6960ef77-d277-4be6-be89-446664dd7775-kube-api-access-tbbsh\") pod \"kube-state-metrics-0\" (UID: \"6960ef77-d277-4be6-be89-446664dd7775\") " pod="openstack/kube-state-metrics-0" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.916589 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6960ef77-d277-4be6-be89-446664dd7775-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6960ef77-d277-4be6-be89-446664dd7775\") " pod="openstack/kube-state-metrics-0" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.916684 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6960ef77-d277-4be6-be89-446664dd7775-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6960ef77-d277-4be6-be89-446664dd7775\") " pod="openstack/kube-state-metrics-0" Dec 02 08:07:00 crc kubenswrapper[4691]: I1202 08:07:00.916711 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6960ef77-d277-4be6-be89-446664dd7775-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6960ef77-d277-4be6-be89-446664dd7775\") " pod="openstack/kube-state-metrics-0" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.017940 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-swift-storage-0\") pod \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.017993 4691 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-ovsdbserver-nb\") pod \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.018045 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-svc\") pod \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.018126 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-config\") pod \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.018179 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v6l7\" (UniqueName: \"kubernetes.io/projected/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-kube-api-access-6v6l7\") pod \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.018389 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-ovsdbserver-sb\") pod \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.018750 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6960ef77-d277-4be6-be89-446664dd7775-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6960ef77-d277-4be6-be89-446664dd7775\") " pod="openstack/kube-state-metrics-0" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.018887 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6960ef77-d277-4be6-be89-446664dd7775-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6960ef77-d277-4be6-be89-446664dd7775\") " pod="openstack/kube-state-metrics-0" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.018907 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6960ef77-d277-4be6-be89-446664dd7775-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6960ef77-d277-4be6-be89-446664dd7775\") " pod="openstack/kube-state-metrics-0" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.019019 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbbsh\" (UniqueName: \"kubernetes.io/projected/6960ef77-d277-4be6-be89-446664dd7775-kube-api-access-tbbsh\") pod \"kube-state-metrics-0\" (UID: \"6960ef77-d277-4be6-be89-446664dd7775\") " pod="openstack/kube-state-metrics-0" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.037366 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6960ef77-d277-4be6-be89-446664dd7775-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" 
(UID: \"6960ef77-d277-4be6-be89-446664dd7775\") " pod="openstack/kube-state-metrics-0" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.037473 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6960ef77-d277-4be6-be89-446664dd7775-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6960ef77-d277-4be6-be89-446664dd7775\") " pod="openstack/kube-state-metrics-0" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.040737 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbbsh\" (UniqueName: \"kubernetes.io/projected/6960ef77-d277-4be6-be89-446664dd7775-kube-api-access-tbbsh\") pod \"kube-state-metrics-0\" (UID: \"6960ef77-d277-4be6-be89-446664dd7775\") " pod="openstack/kube-state-metrics-0" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.040730 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-kube-api-access-6v6l7" (OuterVolumeSpecName: "kube-api-access-6v6l7") pod "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" (UID: "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3"). InnerVolumeSpecName "kube-api-access-6v6l7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.043345 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6960ef77-d277-4be6-be89-446664dd7775-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6960ef77-d277-4be6-be89-446664dd7775\") " pod="openstack/kube-state-metrics-0" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.117451 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" (UID: "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.120882 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v6l7\" (UniqueName: \"kubernetes.io/projected/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-kube-api-access-6v6l7\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.120917 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.134842 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" (UID: "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.139618 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-config" (OuterVolumeSpecName: "config") pod "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" (UID: "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:07:01 crc kubenswrapper[4691]: E1202 08:07:01.171102 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-svc podName:a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3 nodeName:}" failed. No retries permitted until 2025-12-02 08:07:01.67104478 +0000 UTC m=+1269.455123642 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-svc") pod "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" (UID: "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3") : error deleting /var/lib/kubelet/pods/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3/volume-subpaths: remove /var/lib/kubelet/pods/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3/volume-subpaths: no such file or directory Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.171473 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" (UID: "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.193047 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.224036 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.224075 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.224088 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.667850 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.671410 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-v65fc" event={"ID":"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3","Type":"ContainerDied","Data":"f9c7d80f8071a1a9b476478d0ba7bd73928db2f4c9ffe6f55a82918e3334b507"} Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.671461 4691 scope.go:117] "RemoveContainer" containerID="7041b9f35263279ee82e63a63843ff21479f24fc00cc9f5399f091c0bc88c62c" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.720148 4691 scope.go:117] "RemoveContainer" containerID="995361b04d06c767d40cd78fcd5e258a55f7f5f1ecba12fb8cb1e797bc072476" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.745450 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.751413 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-svc\") pod \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\" (UID: \"a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3\") " Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.757432 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" (UID: "a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:07:01 crc kubenswrapper[4691]: I1202 08:07:01.854910 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.150566 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.182182 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-v65fc"] Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.225177 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-v65fc"] Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.264787 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-combined-ca-bundle\") pod \"90e610cf-6ebe-4904-afe8-749d466fa6eb\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.264931 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm5cj\" (UniqueName: \"kubernetes.io/projected/90e610cf-6ebe-4904-afe8-749d466fa6eb-kube-api-access-sm5cj\") pod \"90e610cf-6ebe-4904-afe8-749d466fa6eb\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.265169 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-config-data\") pod \"90e610cf-6ebe-4904-afe8-749d466fa6eb\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.265261 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-scripts\") pod \"90e610cf-6ebe-4904-afe8-749d466fa6eb\" (UID: \"90e610cf-6ebe-4904-afe8-749d466fa6eb\") " Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.286042 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-scripts" (OuterVolumeSpecName: "scripts") pod "90e610cf-6ebe-4904-afe8-749d466fa6eb" (UID: "90e610cf-6ebe-4904-afe8-749d466fa6eb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.293878 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e610cf-6ebe-4904-afe8-749d466fa6eb-kube-api-access-sm5cj" (OuterVolumeSpecName: "kube-api-access-sm5cj") pod "90e610cf-6ebe-4904-afe8-749d466fa6eb" (UID: "90e610cf-6ebe-4904-afe8-749d466fa6eb"). InnerVolumeSpecName "kube-api-access-sm5cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.307299 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-config-data" (OuterVolumeSpecName: "config-data") pod "90e610cf-6ebe-4904-afe8-749d466fa6eb" (UID: "90e610cf-6ebe-4904-afe8-749d466fa6eb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.315029 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90e610cf-6ebe-4904-afe8-749d466fa6eb" (UID: "90e610cf-6ebe-4904-afe8-749d466fa6eb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.368104 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.368154 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm5cj\" (UniqueName: \"kubernetes.io/projected/90e610cf-6ebe-4904-afe8-749d466fa6eb-kube-api-access-sm5cj\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.368168 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.368178 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90e610cf-6ebe-4904-afe8-749d466fa6eb-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.459740 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.460883 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="sg-core" containerID="cri-o://07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80" gracePeriod=30 Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.461043 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="proxy-httpd" containerID="cri-o://154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2" gracePeriod=30 Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.461128 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="ceilometer-notification-agent" containerID="cri-o://09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623" gracePeriod=30 Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.461202 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="ceilometer-central-agent" containerID="cri-o://aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6" gracePeriod=30 Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.576668 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1473d859-fd02-490b-a906-bf8136cb422c" path="/var/lib/kubelet/pods/1473d859-fd02-490b-a906-bf8136cb422c/volumes" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.577431 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" path="/var/lib/kubelet/pods/a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3/volumes" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.690531 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6960ef77-d277-4be6-be89-446664dd7775","Type":"ContainerStarted","Data":"89a7aa01ec854a654cfa0807341cc11a8815a64f6e5073aaaa0fb83d07d73000"} Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.691000 4691 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6960ef77-d277-4be6-be89-446664dd7775","Type":"ContainerStarted","Data":"ee240e8df176f8bb79d534818053900c8cecb972398b72b022e09f4ba5746608"} Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.691016 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.698162 4691 generic.go:334] "Generic (PLEG): container finished" podID="133993ed-c8ce-4a04-8646-defeb53b9904" containerID="07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80" exitCode=2 Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.698228 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"133993ed-c8ce-4a04-8646-defeb53b9904","Type":"ContainerDied","Data":"07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80"} Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.704325 4691 generic.go:334] "Generic (PLEG): container finished" podID="3963ab52-3d58-4201-baa1-6743421bdca3" containerID="b43372f37196cc7c10c33b8f309e2ac4d0c3cbb7f0fee43448aecd1ce4547a0d" exitCode=0 Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.704459 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5bh6t" event={"ID":"3963ab52-3d58-4201-baa1-6743421bdca3","Type":"ContainerDied","Data":"b43372f37196cc7c10c33b8f309e2ac4d0c3cbb7f0fee43448aecd1ce4547a0d"} Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.713638 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.314306562 podStartE2EDuration="2.713623541s" podCreationTimestamp="2025-12-02 08:07:00 +0000 UTC" firstStartedPulling="2025-12-02 08:07:01.773116431 +0000 UTC m=+1269.557195293" lastFinishedPulling="2025-12-02 08:07:02.17243341 +0000 UTC m=+1269.956512272" observedRunningTime="2025-12-02 08:07:02.71038307 +0000 UTC m=+1270.494461962" watchObservedRunningTime="2025-12-02 08:07:02.713623541 +0000 UTC m=+1270.497702403" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.713982 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-bdd2r" event={"ID":"90e610cf-6ebe-4904-afe8-749d466fa6eb","Type":"ContainerDied","Data":"d2db50ccf598479061881b5b7daef041686aad2214725cf3b1e77fc276ff022d"} Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.714024 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2db50ccf598479061881b5b7daef041686aad2214725cf3b1e77fc276ff022d" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.714083 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-bdd2r" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.857992 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.858345 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bee234f2-733f-4011-8cbc-baa61d2323c0" containerName="nova-api-log" containerID="cri-o://387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601" gracePeriod=30 Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.858574 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="bee234f2-733f-4011-8cbc-baa61d2323c0" containerName="nova-api-api" containerID="cri-o://728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f" gracePeriod=30 Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.875981 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.876271 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ac0c01ba-70df-4e67-9582-2984e9fb292c" containerName="nova-scheduler-scheduler" containerID="cri-o://bea1158005c67841b08e91550497b841dd032e2ce897feb145e95426d02dbe30" gracePeriod=30 Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.906917 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.907283 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e5b8f9b9-f3a9-410c-9969-595abdb3932a" containerName="nova-metadata-log" containerID="cri-o://65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355" gracePeriod=30 Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.907383 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e5b8f9b9-f3a9-410c-9969-595abdb3932a" containerName="nova-metadata-metadata" containerID="cri-o://5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d" gracePeriod=30 Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.969298 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 08:07:02 crc kubenswrapper[4691]: I1202 08:07:02.969369 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.579086 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.699194 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7kpl\" (UniqueName: \"kubernetes.io/projected/e5b8f9b9-f3a9-410c-9969-595abdb3932a-kube-api-access-x7kpl\") pod \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.699283 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5b8f9b9-f3a9-410c-9969-595abdb3932a-logs\") pod \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.699340 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-nova-metadata-tls-certs\") pod \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.699420 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-combined-ca-bundle\") pod \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.699487 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-config-data\") pod \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\" (UID: \"e5b8f9b9-f3a9-410c-9969-595abdb3932a\") " Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.701853 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5b8f9b9-f3a9-410c-9969-595abdb3932a-logs" (OuterVolumeSpecName: "logs") pod "e5b8f9b9-f3a9-410c-9969-595abdb3932a" (UID: "e5b8f9b9-f3a9-410c-9969-595abdb3932a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.712902 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5b8f9b9-f3a9-410c-9969-595abdb3932a-kube-api-access-x7kpl" (OuterVolumeSpecName: "kube-api-access-x7kpl") pod "e5b8f9b9-f3a9-410c-9969-595abdb3932a" (UID: "e5b8f9b9-f3a9-410c-9969-595abdb3932a"). InnerVolumeSpecName "kube-api-access-x7kpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.735232 4691 generic.go:334] "Generic (PLEG): container finished" podID="ac0c01ba-70df-4e67-9582-2984e9fb292c" containerID="bea1158005c67841b08e91550497b841dd032e2ce897feb145e95426d02dbe30" exitCode=0 Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.735288 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ac0c01ba-70df-4e67-9582-2984e9fb292c","Type":"ContainerDied","Data":"bea1158005c67841b08e91550497b841dd032e2ce897feb145e95426d02dbe30"} Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.747163 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5b8f9b9-f3a9-410c-9969-595abdb3932a" (UID: "e5b8f9b9-f3a9-410c-9969-595abdb3932a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.750927 4691 generic.go:334] "Generic (PLEG): container finished" podID="bee234f2-733f-4011-8cbc-baa61d2323c0" containerID="387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601" exitCode=143 Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.751039 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bee234f2-733f-4011-8cbc-baa61d2323c0","Type":"ContainerDied","Data":"387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601"} Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.755499 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-config-data" (OuterVolumeSpecName: "config-data") pod "e5b8f9b9-f3a9-410c-9969-595abdb3932a" (UID: "e5b8f9b9-f3a9-410c-9969-595abdb3932a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.761804 4691 generic.go:334] "Generic (PLEG): container finished" podID="133993ed-c8ce-4a04-8646-defeb53b9904" containerID="154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2" exitCode=0 Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.761841 4691 generic.go:334] "Generic (PLEG): container finished" podID="133993ed-c8ce-4a04-8646-defeb53b9904" containerID="aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6" exitCode=0 Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.761893 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"133993ed-c8ce-4a04-8646-defeb53b9904","Type":"ContainerDied","Data":"154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2"} Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.761928 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"133993ed-c8ce-4a04-8646-defeb53b9904","Type":"ContainerDied","Data":"aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6"} Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.764300 4691 generic.go:334] "Generic (PLEG): container finished" podID="e5b8f9b9-f3a9-410c-9969-595abdb3932a" containerID="5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d" exitCode=0 Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.764321 4691 generic.go:334] "Generic (PLEG): container finished" podID="e5b8f9b9-f3a9-410c-9969-595abdb3932a" containerID="65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355" exitCode=143 Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.764559 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.765221 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5b8f9b9-f3a9-410c-9969-595abdb3932a","Type":"ContainerDied","Data":"5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d"} Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.765272 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5b8f9b9-f3a9-410c-9969-595abdb3932a","Type":"ContainerDied","Data":"65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355"} Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.765283 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e5b8f9b9-f3a9-410c-9969-595abdb3932a","Type":"ContainerDied","Data":"d67d15b7146aed307f206336d90601c9d1ca76bbd695455ef60a97c10facf84d"} Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.765299 4691 scope.go:117] "RemoveContainer" containerID="5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.766630 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e5b8f9b9-f3a9-410c-9969-595abdb3932a" (UID: "e5b8f9b9-f3a9-410c-9969-595abdb3932a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.802219 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.802270 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7kpl\" (UniqueName: \"kubernetes.io/projected/e5b8f9b9-f3a9-410c-9969-595abdb3932a-kube-api-access-x7kpl\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.802287 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5b8f9b9-f3a9-410c-9969-595abdb3932a-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.802302 4691 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.802315 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5b8f9b9-f3a9-410c-9969-595abdb3932a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.889799 4691 scope.go:117] "RemoveContainer" containerID="65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.921930 4691 scope.go:117] "RemoveContainer" containerID="5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d" Dec 02 08:07:03 crc kubenswrapper[4691]: E1202 08:07:03.925916 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d\": container with ID starting with 5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d not found: ID does not exist" containerID="5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.925971 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d"} err="failed to get container status \"5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d\": rpc error: code = NotFound desc = could not find container \"5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d\": container with ID starting with 5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d not found: ID does not exist" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.926000 4691 scope.go:117] "RemoveContainer" containerID="65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355" Dec 02 08:07:03 crc kubenswrapper[4691]: E1202 08:07:03.926442 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355\": container with ID starting with 65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355 not found: ID does not exist" containerID="65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.926486 4691 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355"} err="failed to get container status \"65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355\": rpc error: code = NotFound desc = could not find container \"65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355\": container with ID starting with 65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355 not found: ID does not exist" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.926518 4691 scope.go:117] "RemoveContainer" containerID="5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.927075 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d"} err="failed to get container status \"5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d\": rpc error: code = NotFound desc = could not find container \"5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d\": container with ID starting with 5cfeef44e648a59e66b46b26e6388b19ba0b3dee0308033bace904dc50a3fa5d not found: ID does not exist" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.927097 4691 scope.go:117] "RemoveContainer" containerID="65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355" Dec 02 08:07:03 crc kubenswrapper[4691]: I1202 08:07:03.927527 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355"} err="failed to get container status \"65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355\": rpc error: code = NotFound desc = could not find container \"65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355\": container with ID starting with 65885f670c447a0da7baa002feb579ce77ad04ecdc52a84c1388180650210355 not found: ID does not exist" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.117962 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.131878 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.150267 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.151043 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b8f9b9-f3a9-410c-9969-595abdb3932a" containerName="nova-metadata-metadata" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.151074 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b8f9b9-f3a9-410c-9969-595abdb3932a" containerName="nova-metadata-metadata" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.151102 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" containerName="dnsmasq-dns" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.151114 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" containerName="dnsmasq-dns" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.151135 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e610cf-6ebe-4904-afe8-749d466fa6eb" containerName="nova-manage" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.151148 
4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e610cf-6ebe-4904-afe8-749d466fa6eb" containerName="nova-manage" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.151168 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" containerName="init" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.151176 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" containerName="init" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.151191 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5b8f9b9-f3a9-410c-9969-595abdb3932a" containerName="nova-metadata-log" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.151201 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5b8f9b9-f3a9-410c-9969-595abdb3932a" containerName="nova-metadata-log" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.151462 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5f96bed-da77-4e8e-9fc7-a6f2ceefb5b3" containerName="dnsmasq-dns" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.151492 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5b8f9b9-f3a9-410c-9969-595abdb3932a" containerName="nova-metadata-metadata" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.151512 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e610cf-6ebe-4904-afe8-749d466fa6eb" containerName="nova-manage" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.151538 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5b8f9b9-f3a9-410c-9969-595abdb3932a" containerName="nova-metadata-log" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.154669 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.157097 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.157533 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.161195 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.210282 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.296570 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.304668 4691 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5b8f9b9_f3a9_410c_9969_595abdb3932a.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod133993ed_c8ce_4a04_8646_defeb53b9904.slice/crio-conmon-09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623.scope\": RecentStats: unable to find data in memory cache]" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.321941 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c01ba-70df-4e67-9582-2984e9fb292c-combined-ca-bundle\") pod \"ac0c01ba-70df-4e67-9582-2984e9fb292c\" (UID: \"ac0c01ba-70df-4e67-9582-2984e9fb292c\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.322107 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jczt8\" (UniqueName: \"kubernetes.io/projected/ac0c01ba-70df-4e67-9582-2984e9fb292c-kube-api-access-jczt8\") pod \"ac0c01ba-70df-4e67-9582-2984e9fb292c\" (UID: \"ac0c01ba-70df-4e67-9582-2984e9fb292c\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.322251 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0c01ba-70df-4e67-9582-2984e9fb292c-config-data\") pod \"ac0c01ba-70df-4e67-9582-2984e9fb292c\" (UID: \"ac0c01ba-70df-4e67-9582-2984e9fb292c\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.322578 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hd78\" (UniqueName: \"kubernetes.io/projected/d8025b3f-5cba-44b0-9e3e-b965f93104cc-kube-api-access-4hd78\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.322625 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8025b3f-5cba-44b0-9e3e-b965f93104cc-logs\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.322683 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.322786 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-config-data\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.323002 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-nova-metadata-tls-certs\") pod 
\"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.335059 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac0c01ba-70df-4e67-9582-2984e9fb292c-kube-api-access-jczt8" (OuterVolumeSpecName: "kube-api-access-jczt8") pod "ac0c01ba-70df-4e67-9582-2984e9fb292c" (UID: "ac0c01ba-70df-4e67-9582-2984e9fb292c"). InnerVolumeSpecName "kube-api-access-jczt8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.354836 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0c01ba-70df-4e67-9582-2984e9fb292c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ac0c01ba-70df-4e67-9582-2984e9fb292c" (UID: "ac0c01ba-70df-4e67-9582-2984e9fb292c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.375967 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac0c01ba-70df-4e67-9582-2984e9fb292c-config-data" (OuterVolumeSpecName: "config-data") pod "ac0c01ba-70df-4e67-9582-2984e9fb292c" (UID: "ac0c01ba-70df-4e67-9582-2984e9fb292c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.430714 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-combined-ca-bundle\") pod \"3963ab52-3d58-4201-baa1-6743421bdca3\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.431154 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rl2r\" (UniqueName: \"kubernetes.io/projected/3963ab52-3d58-4201-baa1-6743421bdca3-kube-api-access-5rl2r\") pod \"3963ab52-3d58-4201-baa1-6743421bdca3\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.431269 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-scripts\") pod \"3963ab52-3d58-4201-baa1-6743421bdca3\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.431365 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-config-data\") pod \"3963ab52-3d58-4201-baa1-6743421bdca3\" (UID: \"3963ab52-3d58-4201-baa1-6743421bdca3\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.431694 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.431790 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-config-data\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " 
pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.431999 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.432088 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hd78\" (UniqueName: \"kubernetes.io/projected/d8025b3f-5cba-44b0-9e3e-b965f93104cc-kube-api-access-4hd78\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.432154 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8025b3f-5cba-44b0-9e3e-b965f93104cc-logs\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.432235 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac0c01ba-70df-4e67-9582-2984e9fb292c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.432249 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jczt8\" (UniqueName: \"kubernetes.io/projected/ac0c01ba-70df-4e67-9582-2984e9fb292c-kube-api-access-jczt8\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.432263 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac0c01ba-70df-4e67-9582-2984e9fb292c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.432930 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8025b3f-5cba-44b0-9e3e-b965f93104cc-logs\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.443545 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-scripts" (OuterVolumeSpecName: "scripts") pod "3963ab52-3d58-4201-baa1-6743421bdca3" (UID: "3963ab52-3d58-4201-baa1-6743421bdca3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.445302 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.451562 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3963ab52-3d58-4201-baa1-6743421bdca3-kube-api-access-5rl2r" (OuterVolumeSpecName: "kube-api-access-5rl2r") pod "3963ab52-3d58-4201-baa1-6743421bdca3" (UID: "3963ab52-3d58-4201-baa1-6743421bdca3"). InnerVolumeSpecName "kube-api-access-5rl2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.458563 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-config-data\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.459719 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.492752 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hd78\" (UniqueName: \"kubernetes.io/projected/d8025b3f-5cba-44b0-9e3e-b965f93104cc-kube-api-access-4hd78\") pod \"nova-metadata-0\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.509354 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-config-data" (OuterVolumeSpecName: "config-data") pod "3963ab52-3d58-4201-baa1-6743421bdca3" (UID: "3963ab52-3d58-4201-baa1-6743421bdca3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.534725 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.534787 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.534805 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rl2r\" (UniqueName: \"kubernetes.io/projected/3963ab52-3d58-4201-baa1-6743421bdca3-kube-api-access-5rl2r\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.538530 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.546036 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3963ab52-3d58-4201-baa1-6743421bdca3" (UID: "3963ab52-3d58-4201-baa1-6743421bdca3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.597548 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5b8f9b9-f3a9-410c-9969-595abdb3932a" path="/var/lib/kubelet/pods/e5b8f9b9-f3a9-410c-9969-595abdb3932a/volumes" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.636551 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3963ab52-3d58-4201-baa1-6743421bdca3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.705718 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.783719 4691 generic.go:334] "Generic (PLEG): container finished" podID="133993ed-c8ce-4a04-8646-defeb53b9904" containerID="09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623" exitCode=0 Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.783802 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"133993ed-c8ce-4a04-8646-defeb53b9904","Type":"ContainerDied","Data":"09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623"} Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.783831 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"133993ed-c8ce-4a04-8646-defeb53b9904","Type":"ContainerDied","Data":"658b46612fd4bf193ade29cb33202b629494555d579f3c08ce16307aafeeb499"} Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.783849 4691 scope.go:117] "RemoveContainer" containerID="154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.783982 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.787744 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-5bh6t" event={"ID":"3963ab52-3d58-4201-baa1-6743421bdca3","Type":"ContainerDied","Data":"f69d2fdb75712e7b8f05f4e0f3622f604ad6fad1e8be83acc5d668b2f1c92ef5"} Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.787803 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f69d2fdb75712e7b8f05f4e0f3622f604ad6fad1e8be83acc5d668b2f1c92ef5" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.787882 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-5bh6t" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.798278 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ac0c01ba-70df-4e67-9582-2984e9fb292c","Type":"ContainerDied","Data":"1c1264cdd2082088dceee9577f3b9e1658642181e6c4ec9a3af203b3ee827fbf"} Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.798426 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.823498 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.824038 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac0c01ba-70df-4e67-9582-2984e9fb292c" containerName="nova-scheduler-scheduler" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.824051 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac0c01ba-70df-4e67-9582-2984e9fb292c" containerName="nova-scheduler-scheduler" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.824067 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="ceilometer-notification-agent" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.824075 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="ceilometer-notification-agent" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.824089 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="ceilometer-central-agent" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.824095 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="ceilometer-central-agent" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.824134 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="proxy-httpd" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.824141 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="proxy-httpd" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.824159 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3963ab52-3d58-4201-baa1-6743421bdca3" containerName="nova-cell1-conductor-db-sync" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.824166 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3963ab52-3d58-4201-baa1-6743421bdca3" containerName="nova-cell1-conductor-db-sync" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.824182 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="sg-core" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.824188 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="sg-core" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.824415 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="sg-core" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.824441 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="proxy-httpd" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.824454 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3963ab52-3d58-4201-baa1-6743421bdca3" containerName="nova-cell1-conductor-db-sync" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.824469 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="ceilometer-central-agent" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.824481 4691 
memory_manager.go:354] "RemoveStaleState removing state" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" containerName="ceilometer-notification-agent" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.824493 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac0c01ba-70df-4e67-9582-2984e9fb292c" containerName="nova-scheduler-scheduler" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.825292 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.826988 4691 scope.go:117] "RemoveContainer" containerID="07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.828685 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.839379 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/133993ed-c8ce-4a04-8646-defeb53b9904-log-httpd\") pod \"133993ed-c8ce-4a04-8646-defeb53b9904\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.839474 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-combined-ca-bundle\") pod \"133993ed-c8ce-4a04-8646-defeb53b9904\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.839603 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wg5r\" (UniqueName: \"kubernetes.io/projected/133993ed-c8ce-4a04-8646-defeb53b9904-kube-api-access-6wg5r\") pod \"133993ed-c8ce-4a04-8646-defeb53b9904\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.839639 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/133993ed-c8ce-4a04-8646-defeb53b9904-run-httpd\") pod \"133993ed-c8ce-4a04-8646-defeb53b9904\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.839737 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-config-data\") pod \"133993ed-c8ce-4a04-8646-defeb53b9904\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.839785 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-scripts\") pod \"133993ed-c8ce-4a04-8646-defeb53b9904\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.839921 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-sg-core-conf-yaml\") pod \"133993ed-c8ce-4a04-8646-defeb53b9904\" (UID: \"133993ed-c8ce-4a04-8646-defeb53b9904\") " Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.840048 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/133993ed-c8ce-4a04-8646-defeb53b9904-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "133993ed-c8ce-4a04-8646-defeb53b9904" (UID: "133993ed-c8ce-4a04-8646-defeb53b9904"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.840526 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/133993ed-c8ce-4a04-8646-defeb53b9904-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.841204 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/133993ed-c8ce-4a04-8646-defeb53b9904-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "133993ed-c8ce-4a04-8646-defeb53b9904" (UID: "133993ed-c8ce-4a04-8646-defeb53b9904"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.846168 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/133993ed-c8ce-4a04-8646-defeb53b9904-kube-api-access-6wg5r" (OuterVolumeSpecName: "kube-api-access-6wg5r") pod "133993ed-c8ce-4a04-8646-defeb53b9904" (UID: "133993ed-c8ce-4a04-8646-defeb53b9904"). InnerVolumeSpecName "kube-api-access-6wg5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.853349 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-scripts" (OuterVolumeSpecName: "scripts") pod "133993ed-c8ce-4a04-8646-defeb53b9904" (UID: "133993ed-c8ce-4a04-8646-defeb53b9904"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.865415 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.879582 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.882304 4691 scope.go:117] "RemoveContainer" containerID="09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.882943 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "133993ed-c8ce-4a04-8646-defeb53b9904" (UID: "133993ed-c8ce-4a04-8646-defeb53b9904"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.903117 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.916321 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.918724 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.919107 4691 scope.go:117] "RemoveContainer" containerID="aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.922315 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.930240 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.942191 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flf6f\" (UniqueName: \"kubernetes.io/projected/e057af5a-bcd5-4612-9c03-350147146c52-kube-api-access-flf6f\") pod \"nova-cell1-conductor-0\" (UID: \"e057af5a-bcd5-4612-9c03-350147146c52\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.942264 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e057af5a-bcd5-4612-9c03-350147146c52-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e057af5a-bcd5-4612-9c03-350147146c52\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.942293 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e057af5a-bcd5-4612-9c03-350147146c52-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e057af5a-bcd5-4612-9c03-350147146c52\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.946188 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wg5r\" (UniqueName: \"kubernetes.io/projected/133993ed-c8ce-4a04-8646-defeb53b9904-kube-api-access-6wg5r\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.946218 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/133993ed-c8ce-4a04-8646-defeb53b9904-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.946229 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.946242 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.953935 4691 scope.go:117] "RemoveContainer" containerID="154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.954521 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2\": container with ID starting with 154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2 not found: ID does not exist" containerID="154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.954576 4691 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2"} err="failed to get container status \"154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2\": rpc error: code = NotFound desc = could not find container \"154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2\": container with ID starting with 154aa2f061e69fafd0916f2a264a7ed8e7e09262d166b678efb509eac18371c2 not found: ID does not exist" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.954612 4691 scope.go:117] "RemoveContainer" containerID="07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.955101 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80\": container with ID starting with 07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80 not found: ID does not exist" containerID="07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.955140 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80"} err="failed to get container status \"07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80\": rpc error: code = NotFound desc = could not find container \"07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80\": container with ID starting with 07bf242cea2009f8e92d2a357e52bf158dd79469855fa63be8bec87b58523a80 not found: ID does not exist" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.955168 4691 scope.go:117] "RemoveContainer" containerID="09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.955433 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623\": container with ID starting with 09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623 not found: ID does not exist" containerID="09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.955452 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623"} err="failed to get container status \"09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623\": rpc error: code = NotFound desc = could not find container \"09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623\": container with ID starting with 09745c070fe908220e316c47226232fb68faf78e402a467f6c0bf4d3205ee623 not found: ID does not exist" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.955463 4691 scope.go:117] "RemoveContainer" containerID="aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6" Dec 02 08:07:04 crc kubenswrapper[4691]: E1202 08:07:04.957168 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6\": container with ID starting with aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6 not found: ID does 
not exist" containerID="aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.957189 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6"} err="failed to get container status \"aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6\": rpc error: code = NotFound desc = could not find container \"aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6\": container with ID starting with aa3225a1388255de240d8f4823d9c37f3c3f9d92ff6f5364dcf97a8f042b75c6 not found: ID does not exist" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.957202 4691 scope.go:117] "RemoveContainer" containerID="bea1158005c67841b08e91550497b841dd032e2ce897feb145e95426d02dbe30" Dec 02 08:07:04 crc kubenswrapper[4691]: I1202 08:07:04.976678 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "133993ed-c8ce-4a04-8646-defeb53b9904" (UID: "133993ed-c8ce-4a04-8646-defeb53b9904"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:04.994116 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-config-data" (OuterVolumeSpecName: "config-data") pod "133993ed-c8ce-4a04-8646-defeb53b9904" (UID: "133993ed-c8ce-4a04-8646-defeb53b9904"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.030080 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:07:05 crc kubenswrapper[4691]: W1202 08:07:05.031131 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8025b3f_5cba_44b0_9e3e_b965f93104cc.slice/crio-afbd63d2e3ab98c697619224a08cd9676885f83d185bc5da7dc432a1b34da707 WatchSource:0}: Error finding container afbd63d2e3ab98c697619224a08cd9676885f83d185bc5da7dc432a1b34da707: Status 404 returned error can't find the container with id afbd63d2e3ab98c697619224a08cd9676885f83d185bc5da7dc432a1b34da707 Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.048000 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3902ef70-0cb1-462e-a0cc-0b03f653adf5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.048083 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e057af5a-bcd5-4612-9c03-350147146c52-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e057af5a-bcd5-4612-9c03-350147146c52\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.048108 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e057af5a-bcd5-4612-9c03-350147146c52-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e057af5a-bcd5-4612-9c03-350147146c52\") " 
pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.048236 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3902ef70-0cb1-462e-a0cc-0b03f653adf5-config-data\") pod \"nova-scheduler-0\" (UID: \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.048307 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sslrc\" (UniqueName: \"kubernetes.io/projected/3902ef70-0cb1-462e-a0cc-0b03f653adf5-kube-api-access-sslrc\") pod \"nova-scheduler-0\" (UID: \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.048356 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flf6f\" (UniqueName: \"kubernetes.io/projected/e057af5a-bcd5-4612-9c03-350147146c52-kube-api-access-flf6f\") pod \"nova-cell1-conductor-0\" (UID: \"e057af5a-bcd5-4612-9c03-350147146c52\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.048425 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.048439 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/133993ed-c8ce-4a04-8646-defeb53b9904-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.052354 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e057af5a-bcd5-4612-9c03-350147146c52-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"e057af5a-bcd5-4612-9c03-350147146c52\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.054201 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e057af5a-bcd5-4612-9c03-350147146c52-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"e057af5a-bcd5-4612-9c03-350147146c52\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.065218 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flf6f\" (UniqueName: \"kubernetes.io/projected/e057af5a-bcd5-4612-9c03-350147146c52-kube-api-access-flf6f\") pod \"nova-cell1-conductor-0\" (UID: \"e057af5a-bcd5-4612-9c03-350147146c52\") " pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.127714 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.145281 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.150281 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sslrc\" (UniqueName: \"kubernetes.io/projected/3902ef70-0cb1-462e-a0cc-0b03f653adf5-kube-api-access-sslrc\") pod \"nova-scheduler-0\" (UID: \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:05 crc 
kubenswrapper[4691]: I1202 08:07:05.150638 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3902ef70-0cb1-462e-a0cc-0b03f653adf5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.150950 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3902ef70-0cb1-462e-a0cc-0b03f653adf5-config-data\") pod \"nova-scheduler-0\" (UID: \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.155621 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3902ef70-0cb1-462e-a0cc-0b03f653adf5-config-data\") pod \"nova-scheduler-0\" (UID: \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.156969 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3902ef70-0cb1-462e-a0cc-0b03f653adf5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.168594 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.170875 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.174552 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.174597 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.174552 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.175282 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.177020 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sslrc\" (UniqueName: \"kubernetes.io/projected/3902ef70-0cb1-462e-a0cc-0b03f653adf5-kube-api-access-sslrc\") pod \"nova-scheduler-0\" (UID: \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.180509 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.249834 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.252712 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-scripts\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.252807 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.252850 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc105558-7a16-4b17-9a48-28eecf3dd9ed-run-httpd\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.252889 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.252935 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc105558-7a16-4b17-9a48-28eecf3dd9ed-log-httpd\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.252958 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgnjq\" (UniqueName: \"kubernetes.io/projected/cc105558-7a16-4b17-9a48-28eecf3dd9ed-kube-api-access-pgnjq\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.252983 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.253028 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-config-data\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.357466 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-config-data\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.357909 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-scripts\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.358041 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.358097 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc105558-7a16-4b17-9a48-28eecf3dd9ed-run-httpd\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.358249 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.358366 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc105558-7a16-4b17-9a48-28eecf3dd9ed-log-httpd\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.358441 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgnjq\" (UniqueName: \"kubernetes.io/projected/cc105558-7a16-4b17-9a48-28eecf3dd9ed-kube-api-access-pgnjq\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.358466 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.359649 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc105558-7a16-4b17-9a48-28eecf3dd9ed-run-httpd\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.362860 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc105558-7a16-4b17-9a48-28eecf3dd9ed-log-httpd\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.368948 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.370744 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.378404 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.378460 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-scripts\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.378850 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-config-data\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.387559 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgnjq\" (UniqueName: \"kubernetes.io/projected/cc105558-7a16-4b17-9a48-28eecf3dd9ed-kube-api-access-pgnjq\") pod \"ceilometer-0\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.507509 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.674374 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.811610 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.841680 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e057af5a-bcd5-4612-9c03-350147146c52","Type":"ContainerStarted","Data":"237120db9b0327574c675f7697383b152a5c95821f3844216e558cdfae319004"} Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.844327 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8025b3f-5cba-44b0-9e3e-b965f93104cc","Type":"ContainerStarted","Data":"82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce"} Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.844458 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8025b3f-5cba-44b0-9e3e-b965f93104cc","Type":"ContainerStarted","Data":"3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4"} Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.844554 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8025b3f-5cba-44b0-9e3e-b965f93104cc","Type":"ContainerStarted","Data":"afbd63d2e3ab98c697619224a08cd9676885f83d185bc5da7dc432a1b34da707"} Dec 02 08:07:05 crc kubenswrapper[4691]: I1202 08:07:05.876609 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.8765857970000002 podStartE2EDuration="1.876585797s" podCreationTimestamp="2025-12-02 08:07:04 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:07:05.867433408 +0000 UTC m=+1273.651512300" watchObservedRunningTime="2025-12-02 08:07:05.876585797 +0000 UTC m=+1273.660664659" Dec 02 08:07:06 crc kubenswrapper[4691]: I1202 08:07:06.072623 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:07:06 crc kubenswrapper[4691]: W1202 08:07:06.075678 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc105558_7a16_4b17_9a48_28eecf3dd9ed.slice/crio-7ce4f319563cfe354969072b76defceb8d555d12db2372f8324ab5322de206f1 WatchSource:0}: Error finding container 7ce4f319563cfe354969072b76defceb8d555d12db2372f8324ab5322de206f1: Status 404 returned error can't find the container with id 7ce4f319563cfe354969072b76defceb8d555d12db2372f8324ab5322de206f1 Dec 02 08:07:06 crc kubenswrapper[4691]: I1202 08:07:06.575437 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="133993ed-c8ce-4a04-8646-defeb53b9904" path="/var/lib/kubelet/pods/133993ed-c8ce-4a04-8646-defeb53b9904/volumes" Dec 02 08:07:06 crc kubenswrapper[4691]: I1202 08:07:06.576506 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac0c01ba-70df-4e67-9582-2984e9fb292c" path="/var/lib/kubelet/pods/ac0c01ba-70df-4e67-9582-2984e9fb292c/volumes" Dec 02 08:07:06 crc kubenswrapper[4691]: I1202 08:07:06.855315 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3902ef70-0cb1-462e-a0cc-0b03f653adf5","Type":"ContainerStarted","Data":"4e973a6a4fc653a7cf0d5f4a7c07a08ab016cdd8b5c935dfd5123b5099c943a0"} Dec 02 08:07:06 crc kubenswrapper[4691]: I1202 08:07:06.855367 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3902ef70-0cb1-462e-a0cc-0b03f653adf5","Type":"ContainerStarted","Data":"8d451749d8bc877c8633a1e74471bba0a36940b94ee6972eb78b4c284f9130ec"} Dec 02 08:07:06 crc kubenswrapper[4691]: I1202 08:07:06.856896 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc105558-7a16-4b17-9a48-28eecf3dd9ed","Type":"ContainerStarted","Data":"7ce4f319563cfe354969072b76defceb8d555d12db2372f8324ab5322de206f1"} Dec 02 08:07:06 crc kubenswrapper[4691]: I1202 08:07:06.861273 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"e057af5a-bcd5-4612-9c03-350147146c52","Type":"ContainerStarted","Data":"fd6ee76914959bc747d8a0479caaa889283ff5544857392e28c964f7e602de0a"} Dec 02 08:07:06 crc kubenswrapper[4691]: I1202 08:07:06.861603 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:06 crc kubenswrapper[4691]: I1202 08:07:06.905296 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.905272229 podStartE2EDuration="2.905272229s" podCreationTimestamp="2025-12-02 08:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:07:06.888115231 +0000 UTC m=+1274.672194103" watchObservedRunningTime="2025-12-02 08:07:06.905272229 +0000 UTC m=+1274.689351091" Dec 02 08:07:06 crc kubenswrapper[4691]: I1202 08:07:06.918820 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.918800187 podStartE2EDuration="2.918800187s" podCreationTimestamp="2025-12-02 08:07:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:07:06.910180352 +0000 UTC m=+1274.694259234" watchObservedRunningTime="2025-12-02 08:07:06.918800187 +0000 UTC m=+1274.702879049" Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.843589 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.885926 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc105558-7a16-4b17-9a48-28eecf3dd9ed","Type":"ContainerStarted","Data":"7c70c2e311c37a9f8ca51014c33f96d246233f466c93740a7470edb5b63d5ed9"} Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.891826 4691 generic.go:334] "Generic (PLEG): container finished" podID="bee234f2-733f-4011-8cbc-baa61d2323c0" containerID="728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f" exitCode=0 Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.891895 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.891946 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bee234f2-733f-4011-8cbc-baa61d2323c0","Type":"ContainerDied","Data":"728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f"} Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.892020 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"bee234f2-733f-4011-8cbc-baa61d2323c0","Type":"ContainerDied","Data":"fa8f32604933cb47f555bb027347b2ffde50333f29421ae18d4b5d3a354ffad8"} Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.892042 4691 scope.go:117] "RemoveContainer" containerID="728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f" Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.931495 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bee234f2-733f-4011-8cbc-baa61d2323c0-config-data\") pod \"bee234f2-733f-4011-8cbc-baa61d2323c0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.931807 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bee234f2-733f-4011-8cbc-baa61d2323c0-logs\") pod \"bee234f2-733f-4011-8cbc-baa61d2323c0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.931895 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4kb7\" (UniqueName: \"kubernetes.io/projected/bee234f2-733f-4011-8cbc-baa61d2323c0-kube-api-access-z4kb7\") pod \"bee234f2-733f-4011-8cbc-baa61d2323c0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.932051 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bee234f2-733f-4011-8cbc-baa61d2323c0-combined-ca-bundle\") pod \"bee234f2-733f-4011-8cbc-baa61d2323c0\" (UID: \"bee234f2-733f-4011-8cbc-baa61d2323c0\") " Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 
08:07:07.936025 4691 scope.go:117] "RemoveContainer" containerID="387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601" Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.936352 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bee234f2-733f-4011-8cbc-baa61d2323c0-logs" (OuterVolumeSpecName: "logs") pod "bee234f2-733f-4011-8cbc-baa61d2323c0" (UID: "bee234f2-733f-4011-8cbc-baa61d2323c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.942980 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee234f2-733f-4011-8cbc-baa61d2323c0-kube-api-access-z4kb7" (OuterVolumeSpecName: "kube-api-access-z4kb7") pod "bee234f2-733f-4011-8cbc-baa61d2323c0" (UID: "bee234f2-733f-4011-8cbc-baa61d2323c0"). InnerVolumeSpecName "kube-api-access-z4kb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.965973 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee234f2-733f-4011-8cbc-baa61d2323c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bee234f2-733f-4011-8cbc-baa61d2323c0" (UID: "bee234f2-733f-4011-8cbc-baa61d2323c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:07 crc kubenswrapper[4691]: I1202 08:07:07.975849 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bee234f2-733f-4011-8cbc-baa61d2323c0-config-data" (OuterVolumeSpecName: "config-data") pod "bee234f2-733f-4011-8cbc-baa61d2323c0" (UID: "bee234f2-733f-4011-8cbc-baa61d2323c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.020886 4691 scope.go:117] "RemoveContainer" containerID="728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f" Dec 02 08:07:08 crc kubenswrapper[4691]: E1202 08:07:08.021517 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f\": container with ID starting with 728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f not found: ID does not exist" containerID="728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.021575 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f"} err="failed to get container status \"728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f\": rpc error: code = NotFound desc = could not find container \"728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f\": container with ID starting with 728aae6561ab3985f24d7b5bc72872d0072b61d8f3dcd350463ab1a6ac66275f not found: ID does not exist" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.021608 4691 scope.go:117] "RemoveContainer" containerID="387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601" Dec 02 08:07:08 crc kubenswrapper[4691]: E1202 08:07:08.022137 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601\": container with ID starting with 387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601 not found: ID does not exist" containerID="387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.022181 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601"} err="failed to get container status \"387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601\": rpc error: code = NotFound desc = could not find container \"387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601\": container with ID starting with 387c56f5275ae9c7595e21f75b7a380c3ab2e9bf97dfd3dcaed7feb1a61fe601 not found: ID does not exist" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.034648 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bee234f2-733f-4011-8cbc-baa61d2323c0-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.034681 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bee234f2-733f-4011-8cbc-baa61d2323c0-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.034694 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4kb7\" (UniqueName: \"kubernetes.io/projected/bee234f2-733f-4011-8cbc-baa61d2323c0-kube-api-access-z4kb7\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.034705 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bee234f2-733f-4011-8cbc-baa61d2323c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.257894 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.277164 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.292742 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:08 crc kubenswrapper[4691]: E1202 08:07:08.293264 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee234f2-733f-4011-8cbc-baa61d2323c0" containerName="nova-api-api" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.293280 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee234f2-733f-4011-8cbc-baa61d2323c0" containerName="nova-api-api" Dec 02 08:07:08 crc kubenswrapper[4691]: E1202 08:07:08.293312 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bee234f2-733f-4011-8cbc-baa61d2323c0" containerName="nova-api-log" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.293321 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee234f2-733f-4011-8cbc-baa61d2323c0" containerName="nova-api-log" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.293518 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee234f2-733f-4011-8cbc-baa61d2323c0" containerName="nova-api-api" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.293543 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="bee234f2-733f-4011-8cbc-baa61d2323c0" containerName="nova-api-log" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.294802 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.300538 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.313194 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.461027 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsmzd\" (UniqueName: \"kubernetes.io/projected/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-kube-api-access-hsmzd\") pod \"nova-api-0\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.461092 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-logs\") pod \"nova-api-0\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.461277 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-config-data\") pod \"nova-api-0\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.461347 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.563248 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsmzd\" (UniqueName: \"kubernetes.io/projected/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-kube-api-access-hsmzd\") pod \"nova-api-0\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.563686 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-logs\") pod \"nova-api-0\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.563743 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-config-data\") pod \"nova-api-0\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.563787 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.564281 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-logs\") pod \"nova-api-0\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " 
pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.579339 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-config-data\") pod \"nova-api-0\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.579469 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.584275 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bee234f2-733f-4011-8cbc-baa61d2323c0" path="/var/lib/kubelet/pods/bee234f2-733f-4011-8cbc-baa61d2323c0/volumes" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.587637 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsmzd\" (UniqueName: \"kubernetes.io/projected/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-kube-api-access-hsmzd\") pod \"nova-api-0\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.632060 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.920678 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc105558-7a16-4b17-9a48-28eecf3dd9ed","Type":"ContainerStarted","Data":"a6b7bc983691339c4791cc2c0ac571fa42d67c95bb395884e0221cf8e14ee77e"} Dec 02 08:07:08 crc kubenswrapper[4691]: I1202 08:07:08.920993 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc105558-7a16-4b17-9a48-28eecf3dd9ed","Type":"ContainerStarted","Data":"83da59dbccc688f6c4a34aa381b2294e1b55e4237aa6aac282f7ca40a81fe7fc"} Dec 02 08:07:09 crc kubenswrapper[4691]: I1202 08:07:09.235434 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:09 crc kubenswrapper[4691]: I1202 08:07:09.540457 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 08:07:09 crc kubenswrapper[4691]: I1202 08:07:09.540749 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 08:07:09 crc kubenswrapper[4691]: I1202 08:07:09.936194 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c","Type":"ContainerStarted","Data":"9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b"} Dec 02 08:07:09 crc kubenswrapper[4691]: I1202 08:07:09.936550 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c","Type":"ContainerStarted","Data":"219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a"} Dec 02 08:07:09 crc kubenswrapper[4691]: I1202 08:07:09.936562 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c","Type":"ContainerStarted","Data":"a09c0a8edc6362015fdd4c91b7516a0fe99cbb207046d88db1b774c534c5089e"} Dec 02 08:07:09 crc kubenswrapper[4691]: I1202 08:07:09.979013 4691 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.978992725 podStartE2EDuration="1.978992725s" podCreationTimestamp="2025-12-02 08:07:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:07:09.95634858 +0000 UTC m=+1277.740427442" watchObservedRunningTime="2025-12-02 08:07:09.978992725 +0000 UTC m=+1277.763071587" Dec 02 08:07:10 crc kubenswrapper[4691]: I1202 08:07:10.205125 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 02 08:07:10 crc kubenswrapper[4691]: I1202 08:07:10.249963 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 08:07:10 crc kubenswrapper[4691]: I1202 08:07:10.953840 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc105558-7a16-4b17-9a48-28eecf3dd9ed","Type":"ContainerStarted","Data":"0407abe7c66b093d0708b94489c7b5275a58ef63d40aba38bb29dc33cb4abf5b"} Dec 02 08:07:10 crc kubenswrapper[4691]: I1202 08:07:10.954181 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 08:07:10 crc kubenswrapper[4691]: I1202 08:07:10.988480 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.162238315 podStartE2EDuration="5.988460638s" podCreationTimestamp="2025-12-02 08:07:05 +0000 UTC" firstStartedPulling="2025-12-02 08:07:06.078251222 +0000 UTC m=+1273.862330094" lastFinishedPulling="2025-12-02 08:07:09.904473555 +0000 UTC m=+1277.688552417" observedRunningTime="2025-12-02 08:07:10.978013437 +0000 UTC m=+1278.762092309" watchObservedRunningTime="2025-12-02 08:07:10.988460638 +0000 UTC m=+1278.772539490" Dec 02 08:07:11 crc kubenswrapper[4691]: I1202 08:07:11.203116 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 02 08:07:14 crc kubenswrapper[4691]: I1202 08:07:14.539749 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 08:07:14 crc kubenswrapper[4691]: I1202 08:07:14.540061 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 08:07:15 crc kubenswrapper[4691]: I1202 08:07:15.250320 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 08:07:15 crc kubenswrapper[4691]: I1202 08:07:15.277845 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 08:07:15 crc kubenswrapper[4691]: I1202 08:07:15.555974 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 08:07:15 crc kubenswrapper[4691]: I1202 08:07:15.555974 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 08:07:16 crc kubenswrapper[4691]: I1202 08:07:16.071238 4691 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 08:07:18 crc kubenswrapper[4691]: I1202 08:07:18.632678 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:07:18 crc kubenswrapper[4691]: I1202 08:07:18.633061 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:07:19 crc kubenswrapper[4691]: I1202 08:07:19.632819 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:07:19 crc kubenswrapper[4691]: I1202 08:07:19.674049 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.195:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 02 08:07:24 crc kubenswrapper[4691]: I1202 08:07:24.545807 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 08:07:24 crc kubenswrapper[4691]: I1202 08:07:24.548984 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 08:07:24 crc kubenswrapper[4691]: I1202 08:07:24.551644 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 08:07:25 crc kubenswrapper[4691]: I1202 08:07:25.092644 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.014164 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.105245 4691 generic.go:334] "Generic (PLEG): container finished" podID="86f7acc1-f8b0-4333-8f26-4f003a37e4d9" containerID="f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100" exitCode=137 Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.106037 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.106197 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"86f7acc1-f8b0-4333-8f26-4f003a37e4d9","Type":"ContainerDied","Data":"f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100"} Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.106237 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"86f7acc1-f8b0-4333-8f26-4f003a37e4d9","Type":"ContainerDied","Data":"d947862d79bba2a4ac766e7238f991eb446aee970bbe00fb21dbc46f6a01ae3d"} Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.106259 4691 scope.go:117] "RemoveContainer" containerID="f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.134762 4691 scope.go:117] "RemoveContainer" containerID="f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100" Dec 02 08:07:27 crc kubenswrapper[4691]: E1202 08:07:27.135554 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100\": container with ID starting with f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100 not found: ID does not exist" containerID="f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.135615 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100"} err="failed to get container status \"f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100\": rpc error: code = NotFound desc = could not find container \"f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100\": container with ID starting with f6d5d7b4cf14d514ebc76e44383025d581e0d587e302da1fdfd0565c1bfcc100 not found: ID does not exist" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.205142 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ch5bd\" (UniqueName: \"kubernetes.io/projected/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-kube-api-access-ch5bd\") pod \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\" (UID: \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\") " Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.205266 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-combined-ca-bundle\") pod \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\" (UID: \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\") " Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.205317 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-config-data\") pod \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\" (UID: \"86f7acc1-f8b0-4333-8f26-4f003a37e4d9\") " Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.212050 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-kube-api-access-ch5bd" (OuterVolumeSpecName: "kube-api-access-ch5bd") pod "86f7acc1-f8b0-4333-8f26-4f003a37e4d9" (UID: "86f7acc1-f8b0-4333-8f26-4f003a37e4d9"). 
InnerVolumeSpecName "kube-api-access-ch5bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.298377 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86f7acc1-f8b0-4333-8f26-4f003a37e4d9" (UID: "86f7acc1-f8b0-4333-8f26-4f003a37e4d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.298876 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-config-data" (OuterVolumeSpecName: "config-data") pod "86f7acc1-f8b0-4333-8f26-4f003a37e4d9" (UID: "86f7acc1-f8b0-4333-8f26-4f003a37e4d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.310921 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ch5bd\" (UniqueName: \"kubernetes.io/projected/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-kube-api-access-ch5bd\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.310956 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.310971 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86f7acc1-f8b0-4333-8f26-4f003a37e4d9-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.448211 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.459277 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.476178 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:07:27 crc kubenswrapper[4691]: E1202 08:07:27.476608 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f7acc1-f8b0-4333-8f26-4f003a37e4d9" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.476625 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f7acc1-f8b0-4333-8f26-4f003a37e4d9" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.476905 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f7acc1-f8b0-4333-8f26-4f003a37e4d9" containerName="nova-cell1-novncproxy-novncproxy" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.477610 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.481083 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.481336 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.482093 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.488611 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.514012 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.514309 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.514341 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggfdf\" (UniqueName: \"kubernetes.io/projected/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-kube-api-access-ggfdf\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.514419 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.514534 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.615101 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.615216 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 
08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.615272 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.615343 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.615378 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggfdf\" (UniqueName: \"kubernetes.io/projected/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-kube-api-access-ggfdf\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.619715 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.619875 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.620235 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.620247 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.634456 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggfdf\" (UniqueName: \"kubernetes.io/projected/0dedf0d3-f3c7-4cb1-9003-8ac588994c43-kube-api-access-ggfdf\") pod \"nova-cell1-novncproxy-0\" (UID: \"0dedf0d3-f3c7-4cb1-9003-8ac588994c43\") " pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:27 crc kubenswrapper[4691]: I1202 08:07:27.795393 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:28 crc kubenswrapper[4691]: I1202 08:07:28.252379 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 02 08:07:28 crc kubenswrapper[4691]: I1202 08:07:28.573901 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f7acc1-f8b0-4333-8f26-4f003a37e4d9" path="/var/lib/kubelet/pods/86f7acc1-f8b0-4333-8f26-4f003a37e4d9/volumes" Dec 02 08:07:28 crc kubenswrapper[4691]: I1202 08:07:28.638705 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 08:07:28 crc kubenswrapper[4691]: I1202 08:07:28.639138 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 08:07:28 crc kubenswrapper[4691]: I1202 08:07:28.639471 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 08:07:28 crc kubenswrapper[4691]: I1202 08:07:28.652156 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.127429 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0dedf0d3-f3c7-4cb1-9003-8ac588994c43","Type":"ContainerStarted","Data":"6ee76c044bda6fd2350ed8d8eca45440666ec02c828aeabd5a9e74f45b2f1eaf"} Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.127753 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.127805 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0dedf0d3-f3c7-4cb1-9003-8ac588994c43","Type":"ContainerStarted","Data":"1ef29e5eacb1a2fd74c6094fcdeaa988634937321aebe0a10cc45a0379389fd9"} Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.140957 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.157249 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.157224583 podStartE2EDuration="2.157224583s" podCreationTimestamp="2025-12-02 08:07:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:07:29.151823688 +0000 UTC m=+1296.935902560" watchObservedRunningTime="2025-12-02 08:07:29.157224583 +0000 UTC m=+1296.941303445" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.370578 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-97rjx"] Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.372774 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.405867 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-97rjx"] Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.475042 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.475133 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.475155 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-config\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.475194 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.475223 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtqxl\" (UniqueName: \"kubernetes.io/projected/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-kube-api-access-wtqxl\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.475288 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.577539 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.577644 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.577673 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-config\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.577720 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.578728 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-config\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.578903 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.579040 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.579079 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtqxl\" (UniqueName: \"kubernetes.io/projected/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-kube-api-access-wtqxl\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.579467 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.579592 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.580434 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.601582 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtqxl\" (UniqueName: 
\"kubernetes.io/projected/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-kube-api-access-wtqxl\") pod \"dnsmasq-dns-89c5cd4d5-97rjx\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:29 crc kubenswrapper[4691]: I1202 08:07:29.717079 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:30 crc kubenswrapper[4691]: I1202 08:07:30.239298 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-97rjx"] Dec 02 08:07:30 crc kubenswrapper[4691]: W1202 08:07:30.242073 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd265ec22_2ef2_49d9_99d5_7b3554b6a32f.slice/crio-ed07e8babc3b6132924bd738ec3ae8c47729cc5b43b9a716f94a79900d89d715 WatchSource:0}: Error finding container ed07e8babc3b6132924bd738ec3ae8c47729cc5b43b9a716f94a79900d89d715: Status 404 returned error can't find the container with id ed07e8babc3b6132924bd738ec3ae8c47729cc5b43b9a716f94a79900d89d715 Dec 02 08:07:31 crc kubenswrapper[4691]: I1202 08:07:31.149607 4691 generic.go:334] "Generic (PLEG): container finished" podID="d265ec22-2ef2-49d9-99d5-7b3554b6a32f" containerID="7b55d608ba646059f356b58cf6be84f7d2172e267c04b75c23af0bde1952396b" exitCode=0 Dec 02 08:07:31 crc kubenswrapper[4691]: I1202 08:07:31.149803 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" event={"ID":"d265ec22-2ef2-49d9-99d5-7b3554b6a32f","Type":"ContainerDied","Data":"7b55d608ba646059f356b58cf6be84f7d2172e267c04b75c23af0bde1952396b"} Dec 02 08:07:31 crc kubenswrapper[4691]: I1202 08:07:31.150200 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" event={"ID":"d265ec22-2ef2-49d9-99d5-7b3554b6a32f","Type":"ContainerStarted","Data":"ed07e8babc3b6132924bd738ec3ae8c47729cc5b43b9a716f94a79900d89d715"} Dec 02 08:07:31 crc kubenswrapper[4691]: I1202 08:07:31.697372 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:31 crc kubenswrapper[4691]: I1202 08:07:31.826733 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:07:31 crc kubenswrapper[4691]: I1202 08:07:31.827183 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="ceilometer-central-agent" containerID="cri-o://7c70c2e311c37a9f8ca51014c33f96d246233f466c93740a7470edb5b63d5ed9" gracePeriod=30 Dec 02 08:07:31 crc kubenswrapper[4691]: I1202 08:07:31.827287 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="proxy-httpd" containerID="cri-o://0407abe7c66b093d0708b94489c7b5275a58ef63d40aba38bb29dc33cb4abf5b" gracePeriod=30 Dec 02 08:07:31 crc kubenswrapper[4691]: I1202 08:07:31.827366 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="sg-core" containerID="cri-o://a6b7bc983691339c4791cc2c0ac571fa42d67c95bb395884e0221cf8e14ee77e" gracePeriod=30 Dec 02 08:07:31 crc kubenswrapper[4691]: I1202 08:07:31.827441 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" 
containerName="ceilometer-notification-agent" containerID="cri-o://83da59dbccc688f6c4a34aa381b2294e1b55e4237aa6aac282f7ca40a81fe7fc" gracePeriod=30 Dec 02 08:07:32 crc kubenswrapper[4691]: I1202 08:07:32.023894 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.194:3000/\": EOF" Dec 02 08:07:32 crc kubenswrapper[4691]: I1202 08:07:32.163489 4691 generic.go:334] "Generic (PLEG): container finished" podID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerID="a6b7bc983691339c4791cc2c0ac571fa42d67c95bb395884e0221cf8e14ee77e" exitCode=2 Dec 02 08:07:32 crc kubenswrapper[4691]: I1202 08:07:32.163573 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc105558-7a16-4b17-9a48-28eecf3dd9ed","Type":"ContainerDied","Data":"a6b7bc983691339c4791cc2c0ac571fa42d67c95bb395884e0221cf8e14ee77e"} Dec 02 08:07:32 crc kubenswrapper[4691]: I1202 08:07:32.166482 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" event={"ID":"d265ec22-2ef2-49d9-99d5-7b3554b6a32f","Type":"ContainerStarted","Data":"b08387eed4168dd295210e3e140ea450d30f3d2fa31d69fc3b9bb8982ad8824f"} Dec 02 08:07:32 crc kubenswrapper[4691]: I1202 08:07:32.166553 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" containerName="nova-api-log" containerID="cri-o://219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a" gracePeriod=30 Dec 02 08:07:32 crc kubenswrapper[4691]: I1202 08:07:32.166693 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" containerName="nova-api-api" containerID="cri-o://9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b" gracePeriod=30 Dec 02 08:07:32 crc kubenswrapper[4691]: I1202 08:07:32.191748 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" podStartSLOduration=3.191729221 podStartE2EDuration="3.191729221s" podCreationTimestamp="2025-12-02 08:07:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:07:32.189131066 +0000 UTC m=+1299.973209928" watchObservedRunningTime="2025-12-02 08:07:32.191729221 +0000 UTC m=+1299.975808083" Dec 02 08:07:32 crc kubenswrapper[4691]: I1202 08:07:32.796074 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:33 crc kubenswrapper[4691]: I1202 08:07:33.178401 4691 generic.go:334] "Generic (PLEG): container finished" podID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerID="0407abe7c66b093d0708b94489c7b5275a58ef63d40aba38bb29dc33cb4abf5b" exitCode=0 Dec 02 08:07:33 crc kubenswrapper[4691]: I1202 08:07:33.179710 4691 generic.go:334] "Generic (PLEG): container finished" podID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerID="7c70c2e311c37a9f8ca51014c33f96d246233f466c93740a7470edb5b63d5ed9" exitCode=0 Dec 02 08:07:33 crc kubenswrapper[4691]: I1202 08:07:33.178490 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc105558-7a16-4b17-9a48-28eecf3dd9ed","Type":"ContainerDied","Data":"0407abe7c66b093d0708b94489c7b5275a58ef63d40aba38bb29dc33cb4abf5b"} Dec 02 
08:07:33 crc kubenswrapper[4691]: I1202 08:07:33.179909 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc105558-7a16-4b17-9a48-28eecf3dd9ed","Type":"ContainerDied","Data":"7c70c2e311c37a9f8ca51014c33f96d246233f466c93740a7470edb5b63d5ed9"} Dec 02 08:07:33 crc kubenswrapper[4691]: I1202 08:07:33.182105 4691 generic.go:334] "Generic (PLEG): container finished" podID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" containerID="219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a" exitCode=143 Dec 02 08:07:33 crc kubenswrapper[4691]: I1202 08:07:33.182176 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c","Type":"ContainerDied","Data":"219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a"} Dec 02 08:07:33 crc kubenswrapper[4691]: I1202 08:07:33.182590 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:35 crc kubenswrapper[4691]: E1202 08:07:35.167863 4691 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc105558_7a16_4b17_9a48_28eecf3dd9ed.slice/crio-conmon-83da59dbccc688f6c4a34aa381b2294e1b55e4237aa6aac282f7ca40a81fe7fc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc105558_7a16_4b17_9a48_28eecf3dd9ed.slice/crio-83da59dbccc688f6c4a34aa381b2294e1b55e4237aa6aac282f7ca40a81fe7fc.scope\": RecentStats: unable to find data in memory cache]" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.207178 4691 generic.go:334] "Generic (PLEG): container finished" podID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerID="83da59dbccc688f6c4a34aa381b2294e1b55e4237aa6aac282f7ca40a81fe7fc" exitCode=0 Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.207234 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc105558-7a16-4b17-9a48-28eecf3dd9ed","Type":"ContainerDied","Data":"83da59dbccc688f6c4a34aa381b2294e1b55e4237aa6aac282f7ca40a81fe7fc"} Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.585272 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.706972 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgnjq\" (UniqueName: \"kubernetes.io/projected/cc105558-7a16-4b17-9a48-28eecf3dd9ed-kube-api-access-pgnjq\") pod \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.707042 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc105558-7a16-4b17-9a48-28eecf3dd9ed-log-httpd\") pod \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.707068 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-scripts\") pod \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.707110 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-ceilometer-tls-certs\") pod \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.707132 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-sg-core-conf-yaml\") pod \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.708004 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-config-data\") pod \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.708067 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc105558-7a16-4b17-9a48-28eecf3dd9ed-run-httpd\") pod \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.708154 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-combined-ca-bundle\") pod \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\" (UID: \"cc105558-7a16-4b17-9a48-28eecf3dd9ed\") " Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.709223 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc105558-7a16-4b17-9a48-28eecf3dd9ed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "cc105558-7a16-4b17-9a48-28eecf3dd9ed" (UID: "cc105558-7a16-4b17-9a48-28eecf3dd9ed"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.709324 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc105558-7a16-4b17-9a48-28eecf3dd9ed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "cc105558-7a16-4b17-9a48-28eecf3dd9ed" (UID: "cc105558-7a16-4b17-9a48-28eecf3dd9ed"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.711219 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.712349 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc105558-7a16-4b17-9a48-28eecf3dd9ed-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.712375 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc105558-7a16-4b17-9a48-28eecf3dd9ed-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.714980 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-scripts" (OuterVolumeSpecName: "scripts") pod "cc105558-7a16-4b17-9a48-28eecf3dd9ed" (UID: "cc105558-7a16-4b17-9a48-28eecf3dd9ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.715429 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc105558-7a16-4b17-9a48-28eecf3dd9ed-kube-api-access-pgnjq" (OuterVolumeSpecName: "kube-api-access-pgnjq") pod "cc105558-7a16-4b17-9a48-28eecf3dd9ed" (UID: "cc105558-7a16-4b17-9a48-28eecf3dd9ed"). InnerVolumeSpecName "kube-api-access-pgnjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.768661 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "cc105558-7a16-4b17-9a48-28eecf3dd9ed" (UID: "cc105558-7a16-4b17-9a48-28eecf3dd9ed"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.791101 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "cc105558-7a16-4b17-9a48-28eecf3dd9ed" (UID: "cc105558-7a16-4b17-9a48-28eecf3dd9ed"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.813946 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-config-data\") pod \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.814142 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsmzd\" (UniqueName: \"kubernetes.io/projected/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-kube-api-access-hsmzd\") pod \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.814220 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-combined-ca-bundle\") pod \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.814269 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-logs\") pod \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\" (UID: \"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c\") " Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.814939 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgnjq\" (UniqueName: \"kubernetes.io/projected/cc105558-7a16-4b17-9a48-28eecf3dd9ed-kube-api-access-pgnjq\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.814957 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.814967 4691 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.814976 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.815140 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-logs" (OuterVolumeSpecName: "logs") pod "9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" (UID: "9780ec75-20c3-49c3-9b7c-c5fe68dfb96c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.828604 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-kube-api-access-hsmzd" (OuterVolumeSpecName: "kube-api-access-hsmzd") pod "9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" (UID: "9780ec75-20c3-49c3-9b7c-c5fe68dfb96c"). InnerVolumeSpecName "kube-api-access-hsmzd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.848821 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-config-data" (OuterVolumeSpecName: "config-data") pod "9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" (UID: "9780ec75-20c3-49c3-9b7c-c5fe68dfb96c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.850705 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" (UID: "9780ec75-20c3-49c3-9b7c-c5fe68dfb96c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.854272 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc105558-7a16-4b17-9a48-28eecf3dd9ed" (UID: "cc105558-7a16-4b17-9a48-28eecf3dd9ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.864074 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-config-data" (OuterVolumeSpecName: "config-data") pod "cc105558-7a16-4b17-9a48-28eecf3dd9ed" (UID: "cc105558-7a16-4b17-9a48-28eecf3dd9ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.917211 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.917252 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.917264 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsmzd\" (UniqueName: \"kubernetes.io/projected/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-kube-api-access-hsmzd\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.917282 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc105558-7a16-4b17-9a48-28eecf3dd9ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.917292 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:35 crc kubenswrapper[4691]: I1202 08:07:35.917301 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.221267 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.221173 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc105558-7a16-4b17-9a48-28eecf3dd9ed","Type":"ContainerDied","Data":"7ce4f319563cfe354969072b76defceb8d555d12db2372f8324ab5322de206f1"} Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.221664 4691 scope.go:117] "RemoveContainer" containerID="0407abe7c66b093d0708b94489c7b5275a58ef63d40aba38bb29dc33cb4abf5b" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.224784 4691 generic.go:334] "Generic (PLEG): container finished" podID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" containerID="9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b" exitCode=0 Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.224837 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c","Type":"ContainerDied","Data":"9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b"} Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.224873 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9780ec75-20c3-49c3-9b7c-c5fe68dfb96c","Type":"ContainerDied","Data":"a09c0a8edc6362015fdd4c91b7516a0fe99cbb207046d88db1b774c534c5089e"} Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.224971 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.261486 4691 scope.go:117] "RemoveContainer" containerID="a6b7bc983691339c4791cc2c0ac571fa42d67c95bb395884e0221cf8e14ee77e" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.286520 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.303682 4691 scope.go:117] "RemoveContainer" containerID="83da59dbccc688f6c4a34aa381b2294e1b55e4237aa6aac282f7ca40a81fe7fc" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.307463 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.323931 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.338234 4691 scope.go:117] "RemoveContainer" containerID="7c70c2e311c37a9f8ca51014c33f96d246233f466c93740a7470edb5b63d5ed9" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.341970 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.381004 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:07:36 crc kubenswrapper[4691]: E1202 08:07:36.381618 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="ceilometer-notification-agent" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.381641 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="ceilometer-notification-agent" Dec 02 08:07:36 crc kubenswrapper[4691]: E1202 08:07:36.381655 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" containerName="nova-api-log" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.381663 4691 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" containerName="nova-api-log" Dec 02 08:07:36 crc kubenswrapper[4691]: E1202 08:07:36.381695 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="ceilometer-central-agent" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.381706 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="ceilometer-central-agent" Dec 02 08:07:36 crc kubenswrapper[4691]: E1202 08:07:36.381729 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="proxy-httpd" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.381737 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="proxy-httpd" Dec 02 08:07:36 crc kubenswrapper[4691]: E1202 08:07:36.381779 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="sg-core" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.381787 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="sg-core" Dec 02 08:07:36 crc kubenswrapper[4691]: E1202 08:07:36.381802 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" containerName="nova-api-api" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.381808 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" containerName="nova-api-api" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.382052 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" containerName="nova-api-api" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.382071 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="sg-core" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.382099 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" containerName="nova-api-log" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.382114 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="proxy-httpd" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.382127 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="ceilometer-central-agent" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.382143 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="ceilometer-notification-agent" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.384561 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.386813 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.387107 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.387245 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.391562 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.393926 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.396124 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.396126 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.396621 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.404139 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.414278 4691 scope.go:117] "RemoveContainer" containerID="9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.418039 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.436016 4691 scope.go:117] "RemoveContainer" containerID="219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.456578 4691 scope.go:117] "RemoveContainer" containerID="9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b" Dec 02 08:07:36 crc kubenswrapper[4691]: E1202 08:07:36.457133 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b\": container with ID starting with 9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b not found: ID does not exist" containerID="9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.457180 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b"} err="failed to get container status \"9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b\": rpc error: code = NotFound desc = could not find container \"9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b\": container with ID starting with 9fcd95c97b62a82cff54a4c5bed37d2f1fcbfe2957138aa29e8e783c9846360b not found: ID does not exist" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.457208 4691 scope.go:117] "RemoveContainer" containerID="219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a" Dec 02 08:07:36 crc kubenswrapper[4691]: E1202 08:07:36.457884 4691 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a\": container with ID starting with 219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a not found: ID does not exist" containerID="219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.457926 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a"} err="failed to get container status \"219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a\": rpc error: code = NotFound desc = could not find container \"219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a\": container with ID starting with 219740d6caf94d68435b80e1e7ad72ff87773cef00e7d8c4e529c88e52d7048a not found: ID does not exist" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.529708 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530023 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530090 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-config-data\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530213 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-public-tls-certs\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530260 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530297 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-run-httpd\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530445 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-log-httpd\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 
08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530500 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530558 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmjvp\" (UniqueName: \"kubernetes.io/projected/48ca331e-d631-4edd-ad23-1c22786e88fc-kube-api-access-mmjvp\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530589 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48ca331e-d631-4edd-ad23-1c22786e88fc-logs\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530716 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtcqm\" (UniqueName: \"kubernetes.io/projected/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-kube-api-access-qtcqm\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530823 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-scripts\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530893 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.530917 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-config-data\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.574829 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9780ec75-20c3-49c3-9b7c-c5fe68dfb96c" path="/var/lib/kubelet/pods/9780ec75-20c3-49c3-9b7c-c5fe68dfb96c/volumes" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.575677 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" path="/var/lib/kubelet/pods/cc105558-7a16-4b17-9a48-28eecf3dd9ed/volumes" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.632512 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-public-tls-certs\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.632567 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.632598 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-run-httpd\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.632650 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-log-httpd\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.632672 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.632707 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmjvp\" (UniqueName: \"kubernetes.io/projected/48ca331e-d631-4edd-ad23-1c22786e88fc-kube-api-access-mmjvp\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.633237 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-log-httpd\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.633258 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-run-httpd\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.633571 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48ca331e-d631-4edd-ad23-1c22786e88fc-logs\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.633650 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtcqm\" (UniqueName: \"kubernetes.io/projected/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-kube-api-access-qtcqm\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.633725 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-scripts\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.633808 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.633829 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-config-data\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.633872 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.634007 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.634048 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-config-data\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.635083 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48ca331e-d631-4edd-ad23-1c22786e88fc-logs\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.639287 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-config-data\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.639913 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.640192 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.641345 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.641934 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-public-tls-certs\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.642043 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-config-data\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.654793 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.654992 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.656262 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-scripts\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.658953 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtcqm\" (UniqueName: \"kubernetes.io/projected/79ed78c7-a8cc-4ad0-a0cc-38c0f226df93-kube-api-access-qtcqm\") pod \"ceilometer-0\" (UID: \"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93\") " pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.661671 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmjvp\" (UniqueName: \"kubernetes.io/projected/48ca331e-d631-4edd-ad23-1c22786e88fc-kube-api-access-mmjvp\") pod \"nova-api-0\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " pod="openstack/nova-api-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.712283 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 02 08:07:36 crc kubenswrapper[4691]: I1202 08:07:36.728737 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:07:37 crc kubenswrapper[4691]: I1202 08:07:37.231001 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 02 08:07:37 crc kubenswrapper[4691]: I1202 08:07:37.372574 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:37 crc kubenswrapper[4691]: W1202 08:07:37.377884 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48ca331e_d631_4edd_ad23_1c22786e88fc.slice/crio-f47364d533ca8ecc825b1044dac64ccae786daf8d53bde4483f24006673ebd68 WatchSource:0}: Error finding container f47364d533ca8ecc825b1044dac64ccae786daf8d53bde4483f24006673ebd68: Status 404 returned error can't find the container with id f47364d533ca8ecc825b1044dac64ccae786daf8d53bde4483f24006673ebd68 Dec 02 08:07:37 crc kubenswrapper[4691]: I1202 08:07:37.795955 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:37 crc kubenswrapper[4691]: I1202 08:07:37.816103 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.256005 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48ca331e-d631-4edd-ad23-1c22786e88fc","Type":"ContainerStarted","Data":"7363c888be5122cf7e0ccadcdfef767102b6310a6a92b96c96bcc3f5049ebd44"} Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.256074 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48ca331e-d631-4edd-ad23-1c22786e88fc","Type":"ContainerStarted","Data":"ea911e49461a7e2f43ff2653be03ce22c48d21e7227f15c6d8ff229d50445f59"} Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.256084 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48ca331e-d631-4edd-ad23-1c22786e88fc","Type":"ContainerStarted","Data":"f47364d533ca8ecc825b1044dac64ccae786daf8d53bde4483f24006673ebd68"} Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.259623 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93","Type":"ContainerStarted","Data":"87efaba0079146186b50954207b27c3d06d5c0dfa5361d8bc59617d884e522cb"} Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.259662 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93","Type":"ContainerStarted","Data":"76d86d4b7b7a880decb3a6d4db2c70c964c83d555a708433b16b65fee8d0560f"} Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.278600 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.285671 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.2856465999999998 podStartE2EDuration="2.2856466s" podCreationTimestamp="2025-12-02 08:07:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:07:38.273591739 +0000 UTC m=+1306.057670601" watchObservedRunningTime="2025-12-02 08:07:38.2856466 +0000 UTC m=+1306.069725462" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.605905 4691 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-kbjvb"] Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.607366 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.611904 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.613024 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.618864 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-kbjvb"] Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.755029 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-config-data\") pod \"nova-cell1-cell-mapping-kbjvb\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.755418 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-kbjvb\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.755634 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-scripts\") pod \"nova-cell1-cell-mapping-kbjvb\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.755667 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmlkx\" (UniqueName: \"kubernetes.io/projected/86775270-21b6-4ffa-a279-8bc76a6ca396-kube-api-access-gmlkx\") pod \"nova-cell1-cell-mapping-kbjvb\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.857496 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmlkx\" (UniqueName: \"kubernetes.io/projected/86775270-21b6-4ffa-a279-8bc76a6ca396-kube-api-access-gmlkx\") pod \"nova-cell1-cell-mapping-kbjvb\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.857603 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-config-data\") pod \"nova-cell1-cell-mapping-kbjvb\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.857649 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-kbjvb\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " 
pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.857786 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-scripts\") pod \"nova-cell1-cell-mapping-kbjvb\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.864283 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-kbjvb\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.866227 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-scripts\") pod \"nova-cell1-cell-mapping-kbjvb\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.870416 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-config-data\") pod \"nova-cell1-cell-mapping-kbjvb\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.877380 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmlkx\" (UniqueName: \"kubernetes.io/projected/86775270-21b6-4ffa-a279-8bc76a6ca396-kube-api-access-gmlkx\") pod \"nova-cell1-cell-mapping-kbjvb\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:38 crc kubenswrapper[4691]: I1202 08:07:38.936557 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:39 crc kubenswrapper[4691]: I1202 08:07:39.282072 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93","Type":"ContainerStarted","Data":"e6cc0e3db739b735eace70131773741e0581a8e792ade661987ce5222bee1a81"} Dec 02 08:07:39 crc kubenswrapper[4691]: W1202 08:07:39.459605 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86775270_21b6_4ffa_a279_8bc76a6ca396.slice/crio-e394df47db9ab78d3f1426c25d1ef9b5adc27e9123f40013bf10e8b03d910e3b WatchSource:0}: Error finding container e394df47db9ab78d3f1426c25d1ef9b5adc27e9123f40013bf10e8b03d910e3b: Status 404 returned error can't find the container with id e394df47db9ab78d3f1426c25d1ef9b5adc27e9123f40013bf10e8b03d910e3b Dec 02 08:07:39 crc kubenswrapper[4691]: I1202 08:07:39.463503 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-kbjvb"] Dec 02 08:07:39 crc kubenswrapper[4691]: I1202 08:07:39.720040 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:07:39 crc kubenswrapper[4691]: I1202 08:07:39.791386 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ckjgd"] Dec 02 08:07:39 crc kubenswrapper[4691]: I1202 08:07:39.791721 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" podUID="5402d84d-ce2a-4a49-810e-600f6bfa5fef" containerName="dnsmasq-dns" containerID="cri-o://e3ffb8fee44d8f92819e5d3d4c4aa81a51986252f008b7e59d26cdcdf348fa0c" gracePeriod=10 Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.299728 4691 generic.go:334] "Generic (PLEG): container finished" podID="5402d84d-ce2a-4a49-810e-600f6bfa5fef" containerID="e3ffb8fee44d8f92819e5d3d4c4aa81a51986252f008b7e59d26cdcdf348fa0c" exitCode=0 Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.299820 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" event={"ID":"5402d84d-ce2a-4a49-810e-600f6bfa5fef","Type":"ContainerDied","Data":"e3ffb8fee44d8f92819e5d3d4c4aa81a51986252f008b7e59d26cdcdf348fa0c"} Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.300255 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" event={"ID":"5402d84d-ce2a-4a49-810e-600f6bfa5fef","Type":"ContainerDied","Data":"14a01ef08bdcc213776ad509536df0d4abc6e656f0620353e8cac27a53e7afc9"} Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.300284 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14a01ef08bdcc213776ad509536df0d4abc6e656f0620353e8cac27a53e7afc9" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.299944 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.305152 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93","Type":"ContainerStarted","Data":"e659be66dde4ac6dcf71d39a07bcc005ee64d00e98fb40a89b87f73c65677acf"} Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.307925 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kbjvb" event={"ID":"86775270-21b6-4ffa-a279-8bc76a6ca396","Type":"ContainerStarted","Data":"8a7c013cd08c51aa6b83f3c007a59d89c34b491d8cc38c528c437d2190932fe0"} Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.307971 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kbjvb" event={"ID":"86775270-21b6-4ffa-a279-8bc76a6ca396","Type":"ContainerStarted","Data":"e394df47db9ab78d3f1426c25d1ef9b5adc27e9123f40013bf10e8b03d910e3b"} Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.369511 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-kbjvb" podStartSLOduration=2.369487524 podStartE2EDuration="2.369487524s" podCreationTimestamp="2025-12-02 08:07:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:07:40.363884534 +0000 UTC m=+1308.147963406" watchObservedRunningTime="2025-12-02 08:07:40.369487524 +0000 UTC m=+1308.153566386" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.505643 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-config\") pod \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.505792 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9lbkz\" (UniqueName: \"kubernetes.io/projected/5402d84d-ce2a-4a49-810e-600f6bfa5fef-kube-api-access-9lbkz\") pod \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.505835 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-ovsdbserver-sb\") pod \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.505890 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-dns-svc\") pod \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.505943 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-ovsdbserver-nb\") pod \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.506028 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-dns-swift-storage-0\") pod \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\" (UID: \"5402d84d-ce2a-4a49-810e-600f6bfa5fef\") " Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.520754 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5402d84d-ce2a-4a49-810e-600f6bfa5fef-kube-api-access-9lbkz" (OuterVolumeSpecName: "kube-api-access-9lbkz") pod "5402d84d-ce2a-4a49-810e-600f6bfa5fef" (UID: "5402d84d-ce2a-4a49-810e-600f6bfa5fef"). InnerVolumeSpecName "kube-api-access-9lbkz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.565614 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5402d84d-ce2a-4a49-810e-600f6bfa5fef" (UID: "5402d84d-ce2a-4a49-810e-600f6bfa5fef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.584228 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5402d84d-ce2a-4a49-810e-600f6bfa5fef" (UID: "5402d84d-ce2a-4a49-810e-600f6bfa5fef"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.589730 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5402d84d-ce2a-4a49-810e-600f6bfa5fef" (UID: "5402d84d-ce2a-4a49-810e-600f6bfa5fef"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.615472 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-config" (OuterVolumeSpecName: "config") pod "5402d84d-ce2a-4a49-810e-600f6bfa5fef" (UID: "5402d84d-ce2a-4a49-810e-600f6bfa5fef"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.616924 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9lbkz\" (UniqueName: \"kubernetes.io/projected/5402d84d-ce2a-4a49-810e-600f6bfa5fef-kube-api-access-9lbkz\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.616965 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.616979 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.616989 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.617002 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.658868 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5402d84d-ce2a-4a49-810e-600f6bfa5fef" (UID: "5402d84d-ce2a-4a49-810e-600f6bfa5fef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:07:40 crc kubenswrapper[4691]: I1202 08:07:40.720873 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5402d84d-ce2a-4a49-810e-600f6bfa5fef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:41 crc kubenswrapper[4691]: I1202 08:07:41.317162 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" Dec 02 08:07:41 crc kubenswrapper[4691]: I1202 08:07:41.364725 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ckjgd"] Dec 02 08:07:41 crc kubenswrapper[4691]: I1202 08:07:41.379412 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-ckjgd"] Dec 02 08:07:42 crc kubenswrapper[4691]: I1202 08:07:42.330276 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"79ed78c7-a8cc-4ad0-a0cc-38c0f226df93","Type":"ContainerStarted","Data":"eb781ec62702c4dbd721e50889a0cad7f913c5adbb79cbd649a8f681438c1831"} Dec 02 08:07:42 crc kubenswrapper[4691]: I1202 08:07:42.330645 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 02 08:07:42 crc kubenswrapper[4691]: I1202 08:07:42.379746 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.466767851 podStartE2EDuration="6.37971897s" podCreationTimestamp="2025-12-02 08:07:36 +0000 UTC" firstStartedPulling="2025-12-02 08:07:37.231824901 +0000 UTC m=+1305.015903763" lastFinishedPulling="2025-12-02 08:07:41.14477602 +0000 UTC m=+1308.928854882" observedRunningTime="2025-12-02 08:07:42.376311315 +0000 UTC m=+1310.160390197" watchObservedRunningTime="2025-12-02 08:07:42.37971897 +0000 UTC m=+1310.163797832" Dec 02 08:07:42 crc kubenswrapper[4691]: I1202 08:07:42.573577 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5402d84d-ce2a-4a49-810e-600f6bfa5fef" path="/var/lib/kubelet/pods/5402d84d-ce2a-4a49-810e-600f6bfa5fef/volumes" Dec 02 08:07:45 crc kubenswrapper[4691]: I1202 08:07:45.163770 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-757b4f8459-ckjgd" podUID="5402d84d-ce2a-4a49-810e-600f6bfa5fef" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.186:5353: i/o timeout" Dec 02 08:07:46 crc kubenswrapper[4691]: I1202 08:07:46.376523 4691 generic.go:334] "Generic (PLEG): container finished" podID="86775270-21b6-4ffa-a279-8bc76a6ca396" containerID="8a7c013cd08c51aa6b83f3c007a59d89c34b491d8cc38c528c437d2190932fe0" exitCode=0 Dec 02 08:07:46 crc kubenswrapper[4691]: I1202 08:07:46.376595 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kbjvb" event={"ID":"86775270-21b6-4ffa-a279-8bc76a6ca396","Type":"ContainerDied","Data":"8a7c013cd08c51aa6b83f3c007a59d89c34b491d8cc38c528c437d2190932fe0"} Dec 02 08:07:46 crc kubenswrapper[4691]: I1202 08:07:46.730429 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:07:46 crc kubenswrapper[4691]: I1202 08:07:46.730473 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.742922 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="48ca331e-d631-4edd-ad23-1c22786e88fc" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.742922 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="48ca331e-d631-4edd-ad23-1c22786e88fc" containerName="nova-api-log" probeResult="failure" output="Get 
\"https://10.217.0.199:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.781111 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.870833 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmlkx\" (UniqueName: \"kubernetes.io/projected/86775270-21b6-4ffa-a279-8bc76a6ca396-kube-api-access-gmlkx\") pod \"86775270-21b6-4ffa-a279-8bc76a6ca396\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.871362 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-combined-ca-bundle\") pod \"86775270-21b6-4ffa-a279-8bc76a6ca396\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.871497 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-scripts\") pod \"86775270-21b6-4ffa-a279-8bc76a6ca396\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.871536 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-config-data\") pod \"86775270-21b6-4ffa-a279-8bc76a6ca396\" (UID: \"86775270-21b6-4ffa-a279-8bc76a6ca396\") " Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.877039 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-scripts" (OuterVolumeSpecName: "scripts") pod "86775270-21b6-4ffa-a279-8bc76a6ca396" (UID: "86775270-21b6-4ffa-a279-8bc76a6ca396"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.877144 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86775270-21b6-4ffa-a279-8bc76a6ca396-kube-api-access-gmlkx" (OuterVolumeSpecName: "kube-api-access-gmlkx") pod "86775270-21b6-4ffa-a279-8bc76a6ca396" (UID: "86775270-21b6-4ffa-a279-8bc76a6ca396"). InnerVolumeSpecName "kube-api-access-gmlkx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.900869 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86775270-21b6-4ffa-a279-8bc76a6ca396" (UID: "86775270-21b6-4ffa-a279-8bc76a6ca396"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.908058 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-config-data" (OuterVolumeSpecName: "config-data") pod "86775270-21b6-4ffa-a279-8bc76a6ca396" (UID: "86775270-21b6-4ffa-a279-8bc76a6ca396"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.973936 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-scripts\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.973979 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.973991 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmlkx\" (UniqueName: \"kubernetes.io/projected/86775270-21b6-4ffa-a279-8bc76a6ca396-kube-api-access-gmlkx\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:47 crc kubenswrapper[4691]: I1202 08:07:47.974040 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86775270-21b6-4ffa-a279-8bc76a6ca396-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:48 crc kubenswrapper[4691]: I1202 08:07:48.396140 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-kbjvb" event={"ID":"86775270-21b6-4ffa-a279-8bc76a6ca396","Type":"ContainerDied","Data":"e394df47db9ab78d3f1426c25d1ef9b5adc27e9123f40013bf10e8b03d910e3b"} Dec 02 08:07:48 crc kubenswrapper[4691]: I1202 08:07:48.396183 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e394df47db9ab78d3f1426c25d1ef9b5adc27e9123f40013bf10e8b03d910e3b" Dec 02 08:07:48 crc kubenswrapper[4691]: I1202 08:07:48.396254 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-kbjvb" Dec 02 08:07:48 crc kubenswrapper[4691]: I1202 08:07:48.629818 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:48 crc kubenswrapper[4691]: I1202 08:07:48.630165 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="48ca331e-d631-4edd-ad23-1c22786e88fc" containerName="nova-api-log" containerID="cri-o://ea911e49461a7e2f43ff2653be03ce22c48d21e7227f15c6d8ff229d50445f59" gracePeriod=30 Dec 02 08:07:48 crc kubenswrapper[4691]: I1202 08:07:48.630725 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="48ca331e-d631-4edd-ad23-1c22786e88fc" containerName="nova-api-api" containerID="cri-o://7363c888be5122cf7e0ccadcdfef767102b6310a6a92b96c96bcc3f5049ebd44" gracePeriod=30 Dec 02 08:07:48 crc kubenswrapper[4691]: I1202 08:07:48.652257 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:07:48 crc kubenswrapper[4691]: I1202 08:07:48.652551 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3902ef70-0cb1-462e-a0cc-0b03f653adf5" containerName="nova-scheduler-scheduler" containerID="cri-o://4e973a6a4fc653a7cf0d5f4a7c07a08ab016cdd8b5c935dfd5123b5099c943a0" gracePeriod=30 Dec 02 08:07:48 crc kubenswrapper[4691]: I1202 08:07:48.662591 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:07:48 crc kubenswrapper[4691]: I1202 08:07:48.662920 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" 
containerName="nova-metadata-log" containerID="cri-o://3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4" gracePeriod=30 Dec 02 08:07:48 crc kubenswrapper[4691]: I1202 08:07:48.663002 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerName="nova-metadata-metadata" containerID="cri-o://82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce" gracePeriod=30 Dec 02 08:07:50 crc kubenswrapper[4691]: E1202 08:07:50.251201 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e973a6a4fc653a7cf0d5f4a7c07a08ab016cdd8b5c935dfd5123b5099c943a0 is running failed: container process not found" containerID="4e973a6a4fc653a7cf0d5f4a7c07a08ab016cdd8b5c935dfd5123b5099c943a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 08:07:50 crc kubenswrapper[4691]: E1202 08:07:50.251877 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e973a6a4fc653a7cf0d5f4a7c07a08ab016cdd8b5c935dfd5123b5099c943a0 is running failed: container process not found" containerID="4e973a6a4fc653a7cf0d5f4a7c07a08ab016cdd8b5c935dfd5123b5099c943a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 08:07:50 crc kubenswrapper[4691]: E1202 08:07:50.252402 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e973a6a4fc653a7cf0d5f4a7c07a08ab016cdd8b5c935dfd5123b5099c943a0 is running failed: container process not found" containerID="4e973a6a4fc653a7cf0d5f4a7c07a08ab016cdd8b5c935dfd5123b5099c943a0" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 02 08:07:50 crc kubenswrapper[4691]: E1202 08:07:50.252441 4691 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 4e973a6a4fc653a7cf0d5f4a7c07a08ab016cdd8b5c935dfd5123b5099c943a0 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="3902ef70-0cb1-462e-a0cc-0b03f653adf5" containerName="nova-scheduler-scheduler" Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.433538 4691 generic.go:334] "Generic (PLEG): container finished" podID="48ca331e-d631-4edd-ad23-1c22786e88fc" containerID="ea911e49461a7e2f43ff2653be03ce22c48d21e7227f15c6d8ff229d50445f59" exitCode=143 Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.433609 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48ca331e-d631-4edd-ad23-1c22786e88fc","Type":"ContainerDied","Data":"ea911e49461a7e2f43ff2653be03ce22c48d21e7227f15c6d8ff229d50445f59"} Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.436797 4691 generic.go:334] "Generic (PLEG): container finished" podID="3902ef70-0cb1-462e-a0cc-0b03f653adf5" containerID="4e973a6a4fc653a7cf0d5f4a7c07a08ab016cdd8b5c935dfd5123b5099c943a0" exitCode=0 Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.436862 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3902ef70-0cb1-462e-a0cc-0b03f653adf5","Type":"ContainerDied","Data":"4e973a6a4fc653a7cf0d5f4a7c07a08ab016cdd8b5c935dfd5123b5099c943a0"} Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.439147 4691 generic.go:334] "Generic (PLEG): container finished" 
podID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerID="3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4" exitCode=143 Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.439170 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8025b3f-5cba-44b0-9e3e-b965f93104cc","Type":"ContainerDied","Data":"3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4"} Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.747034 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.937144 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sslrc\" (UniqueName: \"kubernetes.io/projected/3902ef70-0cb1-462e-a0cc-0b03f653adf5-kube-api-access-sslrc\") pod \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\" (UID: \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\") " Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.937219 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3902ef70-0cb1-462e-a0cc-0b03f653adf5-combined-ca-bundle\") pod \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\" (UID: \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\") " Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.937259 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3902ef70-0cb1-462e-a0cc-0b03f653adf5-config-data\") pod \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\" (UID: \"3902ef70-0cb1-462e-a0cc-0b03f653adf5\") " Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.945709 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3902ef70-0cb1-462e-a0cc-0b03f653adf5-kube-api-access-sslrc" (OuterVolumeSpecName: "kube-api-access-sslrc") pod "3902ef70-0cb1-462e-a0cc-0b03f653adf5" (UID: "3902ef70-0cb1-462e-a0cc-0b03f653adf5"). InnerVolumeSpecName "kube-api-access-sslrc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.972355 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3902ef70-0cb1-462e-a0cc-0b03f653adf5-config-data" (OuterVolumeSpecName: "config-data") pod "3902ef70-0cb1-462e-a0cc-0b03f653adf5" (UID: "3902ef70-0cb1-462e-a0cc-0b03f653adf5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:50 crc kubenswrapper[4691]: I1202 08:07:50.973505 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3902ef70-0cb1-462e-a0cc-0b03f653adf5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3902ef70-0cb1-462e-a0cc-0b03f653adf5" (UID: "3902ef70-0cb1-462e-a0cc-0b03f653adf5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.039962 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sslrc\" (UniqueName: \"kubernetes.io/projected/3902ef70-0cb1-462e-a0cc-0b03f653adf5-kube-api-access-sslrc\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.040000 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3902ef70-0cb1-462e-a0cc-0b03f653adf5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.040013 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3902ef70-0cb1-462e-a0cc-0b03f653adf5-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.450524 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3902ef70-0cb1-462e-a0cc-0b03f653adf5","Type":"ContainerDied","Data":"8d451749d8bc877c8633a1e74471bba0a36940b94ee6972eb78b4c284f9130ec"} Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.450595 4691 scope.go:117] "RemoveContainer" containerID="4e973a6a4fc653a7cf0d5f4a7c07a08ab016cdd8b5c935dfd5123b5099c943a0" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.450724 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.490191 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.508259 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.519622 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:07:51 crc kubenswrapper[4691]: E1202 08:07:51.520297 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5402d84d-ce2a-4a49-810e-600f6bfa5fef" containerName="dnsmasq-dns" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.520323 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5402d84d-ce2a-4a49-810e-600f6bfa5fef" containerName="dnsmasq-dns" Dec 02 08:07:51 crc kubenswrapper[4691]: E1202 08:07:51.520343 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3902ef70-0cb1-462e-a0cc-0b03f653adf5" containerName="nova-scheduler-scheduler" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.520354 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3902ef70-0cb1-462e-a0cc-0b03f653adf5" containerName="nova-scheduler-scheduler" Dec 02 08:07:51 crc kubenswrapper[4691]: E1202 08:07:51.520370 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86775270-21b6-4ffa-a279-8bc76a6ca396" containerName="nova-manage" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.520399 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="86775270-21b6-4ffa-a279-8bc76a6ca396" containerName="nova-manage" Dec 02 08:07:51 crc kubenswrapper[4691]: E1202 08:07:51.520416 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5402d84d-ce2a-4a49-810e-600f6bfa5fef" containerName="init" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.520424 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5402d84d-ce2a-4a49-810e-600f6bfa5fef" containerName="init" Dec 02 
08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.520661 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3902ef70-0cb1-462e-a0cc-0b03f653adf5" containerName="nova-scheduler-scheduler" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.520689 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="86775270-21b6-4ffa-a279-8bc76a6ca396" containerName="nova-manage" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.520707 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="5402d84d-ce2a-4a49-810e-600f6bfa5fef" containerName="dnsmasq-dns" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.521577 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.526298 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.530183 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.650973 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ff86164-f22f-49e6-8933-e599da966506-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ff86164-f22f-49e6-8933-e599da966506\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.651272 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtl9b\" (UniqueName: \"kubernetes.io/projected/6ff86164-f22f-49e6-8933-e599da966506-kube-api-access-rtl9b\") pod \"nova-scheduler-0\" (UID: \"6ff86164-f22f-49e6-8933-e599da966506\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.651518 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ff86164-f22f-49e6-8933-e599da966506-config-data\") pod \"nova-scheduler-0\" (UID: \"6ff86164-f22f-49e6-8933-e599da966506\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.753675 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ff86164-f22f-49e6-8933-e599da966506-config-data\") pod \"nova-scheduler-0\" (UID: \"6ff86164-f22f-49e6-8933-e599da966506\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.753910 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ff86164-f22f-49e6-8933-e599da966506-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ff86164-f22f-49e6-8933-e599da966506\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.753976 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtl9b\" (UniqueName: \"kubernetes.io/projected/6ff86164-f22f-49e6-8933-e599da966506-kube-api-access-rtl9b\") pod \"nova-scheduler-0\" (UID: \"6ff86164-f22f-49e6-8933-e599da966506\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.758586 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6ff86164-f22f-49e6-8933-e599da966506-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6ff86164-f22f-49e6-8933-e599da966506\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.758683 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ff86164-f22f-49e6-8933-e599da966506-config-data\") pod \"nova-scheduler-0\" (UID: \"6ff86164-f22f-49e6-8933-e599da966506\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.772696 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtl9b\" (UniqueName: \"kubernetes.io/projected/6ff86164-f22f-49e6-8933-e599da966506-kube-api-access-rtl9b\") pod \"nova-scheduler-0\" (UID: \"6ff86164-f22f-49e6-8933-e599da966506\") " pod="openstack/nova-scheduler-0" Dec 02 08:07:51 crc kubenswrapper[4691]: I1202 08:07:51.848326 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 02 08:07:52 crc kubenswrapper[4691]: I1202 08:07:52.318787 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 02 08:07:52 crc kubenswrapper[4691]: W1202 08:07:52.318931 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ff86164_f22f_49e6_8933_e599da966506.slice/crio-a0565c932bd6286858c8f77137e4f384b9475a0a84ef6a82053d6acf24d47eee WatchSource:0}: Error finding container a0565c932bd6286858c8f77137e4f384b9475a0a84ef6a82053d6acf24d47eee: Status 404 returned error can't find the container with id a0565c932bd6286858c8f77137e4f384b9475a0a84ef6a82053d6acf24d47eee Dec 02 08:07:52 crc kubenswrapper[4691]: I1202 08:07:52.459299 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ff86164-f22f-49e6-8933-e599da966506","Type":"ContainerStarted","Data":"a0565c932bd6286858c8f77137e4f384b9475a0a84ef6a82053d6acf24d47eee"} Dec 02 08:07:52 crc kubenswrapper[4691]: I1202 08:07:52.511746 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:47168->10.217.0.191:8775: read: connection reset by peer" Dec 02 08:07:52 crc kubenswrapper[4691]: I1202 08:07:52.511801 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.191:8775/\": read tcp 10.217.0.2:47170->10.217.0.191:8775: read: connection reset by peer" Dec 02 08:07:52 crc kubenswrapper[4691]: I1202 08:07:52.573967 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3902ef70-0cb1-462e-a0cc-0b03f653adf5" path="/var/lib/kubelet/pods/3902ef70-0cb1-462e-a0cc-0b03f653adf5/volumes" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.003583 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.183162 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8025b3f-5cba-44b0-9e3e-b965f93104cc-logs\") pod \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.183380 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-nova-metadata-tls-certs\") pod \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.183428 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-combined-ca-bundle\") pod \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.183469 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-config-data\") pod \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.183585 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hd78\" (UniqueName: \"kubernetes.io/projected/d8025b3f-5cba-44b0-9e3e-b965f93104cc-kube-api-access-4hd78\") pod \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\" (UID: \"d8025b3f-5cba-44b0-9e3e-b965f93104cc\") " Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.184039 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8025b3f-5cba-44b0-9e3e-b965f93104cc-logs" (OuterVolumeSpecName: "logs") pod "d8025b3f-5cba-44b0-9e3e-b965f93104cc" (UID: "d8025b3f-5cba-44b0-9e3e-b965f93104cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.201100 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8025b3f-5cba-44b0-9e3e-b965f93104cc-kube-api-access-4hd78" (OuterVolumeSpecName: "kube-api-access-4hd78") pod "d8025b3f-5cba-44b0-9e3e-b965f93104cc" (UID: "d8025b3f-5cba-44b0-9e3e-b965f93104cc"). InnerVolumeSpecName "kube-api-access-4hd78". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.223316 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-config-data" (OuterVolumeSpecName: "config-data") pod "d8025b3f-5cba-44b0-9e3e-b965f93104cc" (UID: "d8025b3f-5cba-44b0-9e3e-b965f93104cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.231473 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d8025b3f-5cba-44b0-9e3e-b965f93104cc" (UID: "d8025b3f-5cba-44b0-9e3e-b965f93104cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.286056 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hd78\" (UniqueName: \"kubernetes.io/projected/d8025b3f-5cba-44b0-9e3e-b965f93104cc-kube-api-access-4hd78\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.286096 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d8025b3f-5cba-44b0-9e3e-b965f93104cc-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.286109 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.286121 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.314409 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d8025b3f-5cba-44b0-9e3e-b965f93104cc" (UID: "d8025b3f-5cba-44b0-9e3e-b965f93104cc"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.393572 4691 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d8025b3f-5cba-44b0-9e3e-b965f93104cc-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.476159 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6ff86164-f22f-49e6-8933-e599da966506","Type":"ContainerStarted","Data":"fe5289aa63786465f926189b49111f69ad41e82b42b7b21ecedb4793dbe8792f"} Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.481477 4691 generic.go:334] "Generic (PLEG): container finished" podID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerID="82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce" exitCode=0 Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.481524 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8025b3f-5cba-44b0-9e3e-b965f93104cc","Type":"ContainerDied","Data":"82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce"} Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.481556 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d8025b3f-5cba-44b0-9e3e-b965f93104cc","Type":"ContainerDied","Data":"afbd63d2e3ab98c697619224a08cd9676885f83d185bc5da7dc432a1b34da707"} Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.481574 4691 scope.go:117] "RemoveContainer" containerID="82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.481768 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.504032 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.504012046 podStartE2EDuration="2.504012046s" podCreationTimestamp="2025-12-02 08:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:07:53.501710138 +0000 UTC m=+1321.285789020" watchObservedRunningTime="2025-12-02 08:07:53.504012046 +0000 UTC m=+1321.288090908" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.530032 4691 scope.go:117] "RemoveContainer" containerID="3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.532383 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.546970 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.556421 4691 scope.go:117] "RemoveContainer" containerID="82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce" Dec 02 08:07:53 crc kubenswrapper[4691]: E1202 08:07:53.557123 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce\": container with ID starting with 82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce not found: ID does not exist" containerID="82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.557161 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce"} err="failed to get container status \"82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce\": rpc error: code = NotFound desc = could not find container \"82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce\": container with ID starting with 82215295896c439d0d9a244209a2a1dbcc37d8952509d6c676b7716a075443ce not found: ID does not exist" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.557183 4691 scope.go:117] "RemoveContainer" containerID="3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.557348 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:07:53 crc kubenswrapper[4691]: E1202 08:07:53.557787 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerName="nova-metadata-log" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.557828 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerName="nova-metadata-log" Dec 02 08:07:53 crc kubenswrapper[4691]: E1202 08:07:53.557859 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerName="nova-metadata-metadata" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.557866 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerName="nova-metadata-metadata" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.558081 4691 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerName="nova-metadata-metadata" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.558114 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" containerName="nova-metadata-log" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.559227 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: E1202 08:07:53.562033 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4\": container with ID starting with 3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4 not found: ID does not exist" containerID="3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.562087 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4"} err="failed to get container status \"3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4\": rpc error: code = NotFound desc = could not find container \"3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4\": container with ID starting with 3c81edf8b6a3c2b6a083d2f6f800f911319097a59b10301a63c19ff2957c5af4 not found: ID does not exist" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.564225 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.564514 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.574578 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.699530 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd1a81bc-6c1f-4caa-917a-900c527f0df5-config-data\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.699663 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd1a81bc-6c1f-4caa-917a-900c527f0df5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.699799 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlmhn\" (UniqueName: \"kubernetes.io/projected/cd1a81bc-6c1f-4caa-917a-900c527f0df5-kube-api-access-nlmhn\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.699923 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1a81bc-6c1f-4caa-917a-900c527f0df5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.699952 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd1a81bc-6c1f-4caa-917a-900c527f0df5-logs\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.802293 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd1a81bc-6c1f-4caa-917a-900c527f0df5-config-data\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.802401 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd1a81bc-6c1f-4caa-917a-900c527f0df5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.802465 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlmhn\" (UniqueName: \"kubernetes.io/projected/cd1a81bc-6c1f-4caa-917a-900c527f0df5-kube-api-access-nlmhn\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.802524 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd1a81bc-6c1f-4caa-917a-900c527f0df5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.802553 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd1a81bc-6c1f-4caa-917a-900c527f0df5-logs\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.803166 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd1a81bc-6c1f-4caa-917a-900c527f0df5-logs\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.807686 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd1a81bc-6c1f-4caa-917a-900c527f0df5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.807713 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cd1a81bc-6c1f-4caa-917a-900c527f0df5-config-data\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.811232 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/cd1a81bc-6c1f-4caa-917a-900c527f0df5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.823573 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlmhn\" (UniqueName: \"kubernetes.io/projected/cd1a81bc-6c1f-4caa-917a-900c527f0df5-kube-api-access-nlmhn\") pod \"nova-metadata-0\" (UID: \"cd1a81bc-6c1f-4caa-917a-900c527f0df5\") " pod="openstack/nova-metadata-0" Dec 02 08:07:53 crc kubenswrapper[4691]: I1202 08:07:53.901962 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.442969 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 02 08:07:54 crc kubenswrapper[4691]: W1202 08:07:54.506173 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd1a81bc_6c1f_4caa_917a_900c527f0df5.slice/crio-339e0039dc743185b1a5ef4e42c8670dcd3a545863725b945eb05ed3b00e5283 WatchSource:0}: Error finding container 339e0039dc743185b1a5ef4e42c8670dcd3a545863725b945eb05ed3b00e5283: Status 404 returned error can't find the container with id 339e0039dc743185b1a5ef4e42c8670dcd3a545863725b945eb05ed3b00e5283 Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.569569 4691 generic.go:334] "Generic (PLEG): container finished" podID="48ca331e-d631-4edd-ad23-1c22786e88fc" containerID="7363c888be5122cf7e0ccadcdfef767102b6310a6a92b96c96bcc3f5049ebd44" exitCode=0 Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.613784 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8025b3f-5cba-44b0-9e3e-b965f93104cc" path="/var/lib/kubelet/pods/d8025b3f-5cba-44b0-9e3e-b965f93104cc/volumes" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.614532 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48ca331e-d631-4edd-ad23-1c22786e88fc","Type":"ContainerDied","Data":"7363c888be5122cf7e0ccadcdfef767102b6310a6a92b96c96bcc3f5049ebd44"} Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.639226 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.827268 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-combined-ca-bundle\") pod \"48ca331e-d631-4edd-ad23-1c22786e88fc\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.828262 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48ca331e-d631-4edd-ad23-1c22786e88fc-logs\") pod \"48ca331e-d631-4edd-ad23-1c22786e88fc\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.828351 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-internal-tls-certs\") pod \"48ca331e-d631-4edd-ad23-1c22786e88fc\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.828378 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmjvp\" (UniqueName: \"kubernetes.io/projected/48ca331e-d631-4edd-ad23-1c22786e88fc-kube-api-access-mmjvp\") pod \"48ca331e-d631-4edd-ad23-1c22786e88fc\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.828456 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-config-data\") pod \"48ca331e-d631-4edd-ad23-1c22786e88fc\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.828486 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-public-tls-certs\") pod \"48ca331e-d631-4edd-ad23-1c22786e88fc\" (UID: \"48ca331e-d631-4edd-ad23-1c22786e88fc\") " Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.828825 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48ca331e-d631-4edd-ad23-1c22786e88fc-logs" (OuterVolumeSpecName: "logs") pod "48ca331e-d631-4edd-ad23-1c22786e88fc" (UID: "48ca331e-d631-4edd-ad23-1c22786e88fc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.829039 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/48ca331e-d631-4edd-ad23-1c22786e88fc-logs\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.833213 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48ca331e-d631-4edd-ad23-1c22786e88fc-kube-api-access-mmjvp" (OuterVolumeSpecName: "kube-api-access-mmjvp") pod "48ca331e-d631-4edd-ad23-1c22786e88fc" (UID: "48ca331e-d631-4edd-ad23-1c22786e88fc"). InnerVolumeSpecName "kube-api-access-mmjvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.859898 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "48ca331e-d631-4edd-ad23-1c22786e88fc" (UID: "48ca331e-d631-4edd-ad23-1c22786e88fc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.861480 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-config-data" (OuterVolumeSpecName: "config-data") pod "48ca331e-d631-4edd-ad23-1c22786e88fc" (UID: "48ca331e-d631-4edd-ad23-1c22786e88fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.885499 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "48ca331e-d631-4edd-ad23-1c22786e88fc" (UID: "48ca331e-d631-4edd-ad23-1c22786e88fc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.888580 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "48ca331e-d631-4edd-ad23-1c22786e88fc" (UID: "48ca331e-d631-4edd-ad23-1c22786e88fc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.931453 4691 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.931490 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmjvp\" (UniqueName: \"kubernetes.io/projected/48ca331e-d631-4edd-ad23-1c22786e88fc-kube-api-access-mmjvp\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.931504 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.931513 4691 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:54 crc kubenswrapper[4691]: I1202 08:07:54.931523 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48ca331e-d631-4edd-ad23-1c22786e88fc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.583967 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.583718 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"48ca331e-d631-4edd-ad23-1c22786e88fc","Type":"ContainerDied","Data":"f47364d533ca8ecc825b1044dac64ccae786daf8d53bde4483f24006673ebd68"} Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.584111 4691 scope.go:117] "RemoveContainer" containerID="7363c888be5122cf7e0ccadcdfef767102b6310a6a92b96c96bcc3f5049ebd44" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.589010 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd1a81bc-6c1f-4caa-917a-900c527f0df5","Type":"ContainerStarted","Data":"39b39cc28bae822d27c2d5fd8b4fc1497fb0e9be1d1823ee4b7c5edf73737425"} Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.589064 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd1a81bc-6c1f-4caa-917a-900c527f0df5","Type":"ContainerStarted","Data":"b7117b18f68eca7bfc092ffbeaa1604266f0bb53d802c6f80b0b20b825ff5fb9"} Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.589077 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"cd1a81bc-6c1f-4caa-917a-900c527f0df5","Type":"ContainerStarted","Data":"339e0039dc743185b1a5ef4e42c8670dcd3a545863725b945eb05ed3b00e5283"} Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.618186 4691 scope.go:117] "RemoveContainer" containerID="ea911e49461a7e2f43ff2653be03ce22c48d21e7227f15c6d8ff229d50445f59" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.626537 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.626515806 podStartE2EDuration="2.626515806s" podCreationTimestamp="2025-12-02 08:07:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:07:55.616325242 +0000 UTC m=+1323.400404124" watchObservedRunningTime="2025-12-02 08:07:55.626515806 +0000 UTC m=+1323.410594668" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.655474 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.668605 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.679123 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:55 crc kubenswrapper[4691]: E1202 08:07:55.679863 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ca331e-d631-4edd-ad23-1c22786e88fc" containerName="nova-api-api" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.679888 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ca331e-d631-4edd-ad23-1c22786e88fc" containerName="nova-api-api" Dec 02 08:07:55 crc kubenswrapper[4691]: E1202 08:07:55.679908 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48ca331e-d631-4edd-ad23-1c22786e88fc" containerName="nova-api-log" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.679917 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="48ca331e-d631-4edd-ad23-1c22786e88fc" containerName="nova-api-log" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.680150 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ca331e-d631-4edd-ad23-1c22786e88fc" 
containerName="nova-api-log" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.680175 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="48ca331e-d631-4edd-ad23-1c22786e88fc" containerName="nova-api-api" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.681607 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.683907 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.684159 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.688602 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.711800 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:55 crc kubenswrapper[4691]: E1202 08:07:55.777906 4691 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48ca331e_d631_4edd_ad23_1c22786e88fc.slice/crio-f47364d533ca8ecc825b1044dac64ccae786daf8d53bde4483f24006673ebd68\": RecentStats: unable to find data in memory cache]" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.849429 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.849826 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-public-tls-certs\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.849903 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q76w\" (UniqueName: \"kubernetes.io/projected/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-kube-api-access-2q76w\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.849942 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-logs\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.849978 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-config-data\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.850103 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.952807 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.952926 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.952986 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-public-tls-certs\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.953029 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q76w\" (UniqueName: \"kubernetes.io/projected/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-kube-api-access-2q76w\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.953057 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-logs\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.953080 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-config-data\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.953861 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-logs\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.959293 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-internal-tls-certs\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.959681 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.962351 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-public-tls-certs\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.964448 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-config-data\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:55 crc kubenswrapper[4691]: I1202 08:07:55.971470 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q76w\" (UniqueName: \"kubernetes.io/projected/a12282a4-2fdb-4627-b2ff-06dbde0d2fdb-kube-api-access-2q76w\") pod \"nova-api-0\" (UID: \"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb\") " pod="openstack/nova-api-0" Dec 02 08:07:56 crc kubenswrapper[4691]: I1202 08:07:56.017706 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 02 08:07:56 crc kubenswrapper[4691]: W1202 08:07:56.487254 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda12282a4_2fdb_4627_b2ff_06dbde0d2fdb.slice/crio-a3cf6db32c98206c21a605d49714e5d3982dab2baf0f7b4d20e94f9ea3b00dbf WatchSource:0}: Error finding container a3cf6db32c98206c21a605d49714e5d3982dab2baf0f7b4d20e94f9ea3b00dbf: Status 404 returned error can't find the container with id a3cf6db32c98206c21a605d49714e5d3982dab2baf0f7b4d20e94f9ea3b00dbf Dec 02 08:07:56 crc kubenswrapper[4691]: I1202 08:07:56.488458 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 02 08:07:56 crc kubenswrapper[4691]: I1202 08:07:56.574506 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48ca331e-d631-4edd-ad23-1c22786e88fc" path="/var/lib/kubelet/pods/48ca331e-d631-4edd-ad23-1c22786e88fc/volumes" Dec 02 08:07:56 crc kubenswrapper[4691]: I1202 08:07:56.600514 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb","Type":"ContainerStarted","Data":"a3cf6db32c98206c21a605d49714e5d3982dab2baf0f7b4d20e94f9ea3b00dbf"} Dec 02 08:07:56 crc kubenswrapper[4691]: I1202 08:07:56.849133 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 02 08:07:57 crc kubenswrapper[4691]: I1202 08:07:57.616047 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb","Type":"ContainerStarted","Data":"fcb1894c9df629037c9a0ed743e5b027ef00f4839c59fb1c3f85fef1d7f72a22"} Dec 02 08:07:57 crc kubenswrapper[4691]: I1202 08:07:57.616465 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a12282a4-2fdb-4627-b2ff-06dbde0d2fdb","Type":"ContainerStarted","Data":"f1912a5361fcfeecea1e3a7cf10558fbaa370a462f1ad5d126f1f828c23e549f"} Dec 02 08:07:57 crc kubenswrapper[4691]: I1202 08:07:57.645155 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.645131262 podStartE2EDuration="2.645131262s" podCreationTimestamp="2025-12-02 08:07:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:07:57.636780913 +0000 UTC m=+1325.420859785" watchObservedRunningTime="2025-12-02 08:07:57.645131262 
+0000 UTC m=+1325.429210124" Dec 02 08:07:58 crc kubenswrapper[4691]: I1202 08:07:58.902898 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 08:07:58 crc kubenswrapper[4691]: I1202 08:07:58.903066 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 02 08:08:01 crc kubenswrapper[4691]: I1202 08:08:01.848930 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 02 08:08:01 crc kubenswrapper[4691]: I1202 08:08:01.879498 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 02 08:08:02 crc kubenswrapper[4691]: I1202 08:08:02.688367 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 02 08:08:03 crc kubenswrapper[4691]: I1202 08:08:03.902311 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 08:08:03 crc kubenswrapper[4691]: I1202 08:08:03.902359 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 02 08:08:04 crc kubenswrapper[4691]: I1202 08:08:04.921071 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cd1a81bc-6c1f-4caa-917a-900c527f0df5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 08:08:04 crc kubenswrapper[4691]: I1202 08:08:04.921087 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="cd1a81bc-6c1f-4caa-917a-900c527f0df5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 08:08:05 crc kubenswrapper[4691]: I1202 08:08:05.508801 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="cc105558-7a16-4b17-9a48-28eecf3dd9ed" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.194:3000/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 02 08:08:06 crc kubenswrapper[4691]: I1202 08:08:06.018442 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:08:06 crc kubenswrapper[4691]: I1202 08:08:06.018555 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 02 08:08:06 crc kubenswrapper[4691]: I1202 08:08:06.722278 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 02 08:08:07 crc kubenswrapper[4691]: I1202 08:08:07.031062 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a12282a4-2fdb-4627-b2ff-06dbde0d2fdb" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 02 08:08:07 crc kubenswrapper[4691]: I1202 08:08:07.031079 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a12282a4-2fdb-4627-b2ff-06dbde0d2fdb" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout 
exceeded while awaiting headers)" Dec 02 08:08:13 crc kubenswrapper[4691]: I1202 08:08:13.909035 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 08:08:13 crc kubenswrapper[4691]: I1202 08:08:13.909661 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 02 08:08:13 crc kubenswrapper[4691]: I1202 08:08:13.913838 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 08:08:13 crc kubenswrapper[4691]: I1202 08:08:13.915842 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 02 08:08:16 crc kubenswrapper[4691]: I1202 08:08:16.026547 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 08:08:16 crc kubenswrapper[4691]: I1202 08:08:16.027408 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 08:08:16 crc kubenswrapper[4691]: I1202 08:08:16.027491 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 02 08:08:16 crc kubenswrapper[4691]: I1202 08:08:16.037773 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 08:08:16 crc kubenswrapper[4691]: I1202 08:08:16.818891 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 02 08:08:16 crc kubenswrapper[4691]: I1202 08:08:16.825949 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 02 08:08:25 crc kubenswrapper[4691]: I1202 08:08:25.915042 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:08:26 crc kubenswrapper[4691]: I1202 08:08:26.829515 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:08:30 crc kubenswrapper[4691]: I1202 08:08:30.841149 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="1ed4ad29-5963-47aa-ba01-faf16686c61d" containerName="rabbitmq" containerID="cri-o://a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1" gracePeriod=604796 Dec 02 08:08:31 crc kubenswrapper[4691]: I1202 08:08:31.558060 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="8ce7acb7-a140-4c78-a71d-d3c96aa12651" containerName="rabbitmq" containerID="cri-o://14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899" gracePeriod=604796 Dec 02 08:08:33 crc kubenswrapper[4691]: I1202 08:08:33.819510 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="1ed4ad29-5963-47aa-ba01-faf16686c61d" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused" Dec 02 08:08:34 crc kubenswrapper[4691]: I1202 08:08:34.127022 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8ce7acb7-a140-4c78-a71d-d3c96aa12651" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.410905 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.565520 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrs88\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-kube-api-access-vrs88\") pod \"1ed4ad29-5963-47aa-ba01-faf16686c61d\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.565598 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1ed4ad29-5963-47aa-ba01-faf16686c61d\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.565721 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-confd\") pod \"1ed4ad29-5963-47aa-ba01-faf16686c61d\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.565820 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-erlang-cookie\") pod \"1ed4ad29-5963-47aa-ba01-faf16686c61d\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.565865 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-tls\") pod \"1ed4ad29-5963-47aa-ba01-faf16686c61d\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.565945 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-plugins\") pod \"1ed4ad29-5963-47aa-ba01-faf16686c61d\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.566042 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ed4ad29-5963-47aa-ba01-faf16686c61d-erlang-cookie-secret\") pod \"1ed4ad29-5963-47aa-ba01-faf16686c61d\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.566085 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ed4ad29-5963-47aa-ba01-faf16686c61d-pod-info\") pod \"1ed4ad29-5963-47aa-ba01-faf16686c61d\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.566132 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-plugins-conf\") pod \"1ed4ad29-5963-47aa-ba01-faf16686c61d\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.566164 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-config-data\") pod \"1ed4ad29-5963-47aa-ba01-faf16686c61d\" (UID: 
\"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.566193 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-server-conf\") pod \"1ed4ad29-5963-47aa-ba01-faf16686c61d\" (UID: \"1ed4ad29-5963-47aa-ba01-faf16686c61d\") " Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.567895 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "1ed4ad29-5963-47aa-ba01-faf16686c61d" (UID: "1ed4ad29-5963-47aa-ba01-faf16686c61d"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.568043 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "1ed4ad29-5963-47aa-ba01-faf16686c61d" (UID: "1ed4ad29-5963-47aa-ba01-faf16686c61d"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.568148 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "1ed4ad29-5963-47aa-ba01-faf16686c61d" (UID: "1ed4ad29-5963-47aa-ba01-faf16686c61d"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.575219 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "1ed4ad29-5963-47aa-ba01-faf16686c61d" (UID: "1ed4ad29-5963-47aa-ba01-faf16686c61d"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.579720 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/1ed4ad29-5963-47aa-ba01-faf16686c61d-pod-info" (OuterVolumeSpecName: "pod-info") pod "1ed4ad29-5963-47aa-ba01-faf16686c61d" (UID: "1ed4ad29-5963-47aa-ba01-faf16686c61d"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.580188 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "1ed4ad29-5963-47aa-ba01-faf16686c61d" (UID: "1ed4ad29-5963-47aa-ba01-faf16686c61d"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.597605 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-kube-api-access-vrs88" (OuterVolumeSpecName: "kube-api-access-vrs88") pod "1ed4ad29-5963-47aa-ba01-faf16686c61d" (UID: "1ed4ad29-5963-47aa-ba01-faf16686c61d"). InnerVolumeSpecName "kube-api-access-vrs88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.611858 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-config-data" (OuterVolumeSpecName: "config-data") pod "1ed4ad29-5963-47aa-ba01-faf16686c61d" (UID: "1ed4ad29-5963-47aa-ba01-faf16686c61d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.612864 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ed4ad29-5963-47aa-ba01-faf16686c61d-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "1ed4ad29-5963-47aa-ba01-faf16686c61d" (UID: "1ed4ad29-5963-47aa-ba01-faf16686c61d"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.629856 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-server-conf" (OuterVolumeSpecName: "server-conf") pod "1ed4ad29-5963-47aa-ba01-faf16686c61d" (UID: "1ed4ad29-5963-47aa-ba01-faf16686c61d"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.668917 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.668955 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.668968 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.668982 4691 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/1ed4ad29-5963-47aa-ba01-faf16686c61d-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.668993 4691 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/1ed4ad29-5963-47aa-ba01-faf16686c61d-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.669003 4691 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.669015 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.669025 4691 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/1ed4ad29-5963-47aa-ba01-faf16686c61d-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:37 crc 
kubenswrapper[4691]: I1202 08:08:37.669035 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrs88\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-kube-api-access-vrs88\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.669069 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.695629 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.707402 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "1ed4ad29-5963-47aa-ba01-faf16686c61d" (UID: "1ed4ad29-5963-47aa-ba01-faf16686c61d"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.770575 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:37 crc kubenswrapper[4691]: I1202 08:08:37.770608 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/1ed4ad29-5963-47aa-ba01-faf16686c61d-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.021884 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.038393 4691 generic.go:334] "Generic (PLEG): container finished" podID="1ed4ad29-5963-47aa-ba01-faf16686c61d" containerID="a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1" exitCode=0 Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.038482 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1ed4ad29-5963-47aa-ba01-faf16686c61d","Type":"ContainerDied","Data":"a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1"} Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.038527 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"1ed4ad29-5963-47aa-ba01-faf16686c61d","Type":"ContainerDied","Data":"b4d00eb44634459e3e3dda851a5a0bda76903b9f4b467bef070a6776eb4dafb9"} Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.038550 4691 scope.go:117] "RemoveContainer" containerID="a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.038572 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.054406 4691 generic.go:334] "Generic (PLEG): container finished" podID="8ce7acb7-a140-4c78-a71d-d3c96aa12651" containerID="14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899" exitCode=0 Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.054447 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8ce7acb7-a140-4c78-a71d-d3c96aa12651","Type":"ContainerDied","Data":"14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899"} Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.054475 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8ce7acb7-a140-4c78-a71d-d3c96aa12651","Type":"ContainerDied","Data":"12d830600803b83d095461705a6f4f4eb4cec000d454d8a8d98a9c68d35b9e66"} Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.054540 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.082009 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-server-conf\") pod \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.082208 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-config-data\") pod \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.082247 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-plugins-conf\") pod \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.082290 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8ce7acb7-a140-4c78-a71d-d3c96aa12651-pod-info\") pod \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.082342 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-confd\") pod \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.082386 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-tls\") pod \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.082449 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8ce7acb7-a140-4c78-a71d-d3c96aa12651-erlang-cookie-secret\") pod \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\" (UID: 
\"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.082491 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-plugins\") pod \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.082531 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjv8f\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-kube-api-access-zjv8f\") pod \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.082576 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-erlang-cookie\") pod \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.082672 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\" (UID: \"8ce7acb7-a140-4c78-a71d-d3c96aa12651\") " Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.086367 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8ce7acb7-a140-4c78-a71d-d3c96aa12651" (UID: "8ce7acb7-a140-4c78-a71d-d3c96aa12651"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.086842 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8ce7acb7-a140-4c78-a71d-d3c96aa12651" (UID: "8ce7acb7-a140-4c78-a71d-d3c96aa12651"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.090068 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8ce7acb7-a140-4c78-a71d-d3c96aa12651" (UID: "8ce7acb7-a140-4c78-a71d-d3c96aa12651"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.092521 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "8ce7acb7-a140-4c78-a71d-d3c96aa12651" (UID: "8ce7acb7-a140-4c78-a71d-d3c96aa12651"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.093668 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8ce7acb7-a140-4c78-a71d-d3c96aa12651-pod-info" (OuterVolumeSpecName: "pod-info") pod "8ce7acb7-a140-4c78-a71d-d3c96aa12651" (UID: "8ce7acb7-a140-4c78-a71d-d3c96aa12651"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.093745 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8ce7acb7-a140-4c78-a71d-d3c96aa12651" (UID: "8ce7acb7-a140-4c78-a71d-d3c96aa12651"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.104355 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-kube-api-access-zjv8f" (OuterVolumeSpecName: "kube-api-access-zjv8f") pod "8ce7acb7-a140-4c78-a71d-d3c96aa12651" (UID: "8ce7acb7-a140-4c78-a71d-d3c96aa12651"). InnerVolumeSpecName "kube-api-access-zjv8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.105030 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ce7acb7-a140-4c78-a71d-d3c96aa12651-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8ce7acb7-a140-4c78-a71d-d3c96aa12651" (UID: "8ce7acb7-a140-4c78-a71d-d3c96aa12651"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.135849 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-config-data" (OuterVolumeSpecName: "config-data") pod "8ce7acb7-a140-4c78-a71d-d3c96aa12651" (UID: "8ce7acb7-a140-4c78-a71d-d3c96aa12651"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.162390 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-server-conf" (OuterVolumeSpecName: "server-conf") pod "8ce7acb7-a140-4c78-a71d-d3c96aa12651" (UID: "8ce7acb7-a140-4c78-a71d-d3c96aa12651"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.187886 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.187920 4691 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-server-conf\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.187930 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.187937 4691 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8ce7acb7-a140-4c78-a71d-d3c96aa12651-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.187946 4691 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8ce7acb7-a140-4c78-a71d-d3c96aa12651-pod-info\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.187953 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.187961 4691 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8ce7acb7-a140-4c78-a71d-d3c96aa12651-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.187970 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.187979 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjv8f\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-kube-api-access-zjv8f\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.187987 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.239985 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8ce7acb7-a140-4c78-a71d-d3c96aa12651" (UID: "8ce7acb7-a140-4c78-a71d-d3c96aa12651"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.280138 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.302028 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8ce7acb7-a140-4c78-a71d-d3c96aa12651-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.302067 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.390667 4691 scope.go:117] "RemoveContainer" containerID="f96f68dca2c2f22fe5a28cb0a5c8accb88b1b3a79f8ea6b787c57a7a5579d925" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.448826 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.459470 4691 scope.go:117] "RemoveContainer" containerID="a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1" Dec 02 08:08:38 crc kubenswrapper[4691]: E1202 08:08:38.460157 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1\": container with ID starting with a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1 not found: ID does not exist" containerID="a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.460188 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1"} err="failed to get container status \"a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1\": rpc error: code = NotFound desc = could not find container \"a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1\": container with ID starting with a0d70114134586b4a44fded701ede11e7ac146b36153190e936a357357f451d1 not found: ID does not exist" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.460208 4691 scope.go:117] "RemoveContainer" containerID="f96f68dca2c2f22fe5a28cb0a5c8accb88b1b3a79f8ea6b787c57a7a5579d925" Dec 02 08:08:38 crc kubenswrapper[4691]: E1202 08:08:38.479172 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f96f68dca2c2f22fe5a28cb0a5c8accb88b1b3a79f8ea6b787c57a7a5579d925\": container with ID starting with f96f68dca2c2f22fe5a28cb0a5c8accb88b1b3a79f8ea6b787c57a7a5579d925 not found: ID does not exist" containerID="f96f68dca2c2f22fe5a28cb0a5c8accb88b1b3a79f8ea6b787c57a7a5579d925" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.479234 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f96f68dca2c2f22fe5a28cb0a5c8accb88b1b3a79f8ea6b787c57a7a5579d925"} err="failed to get container status \"f96f68dca2c2f22fe5a28cb0a5c8accb88b1b3a79f8ea6b787c57a7a5579d925\": rpc error: code = NotFound desc = could not find container \"f96f68dca2c2f22fe5a28cb0a5c8accb88b1b3a79f8ea6b787c57a7a5579d925\": container with ID starting with 
f96f68dca2c2f22fe5a28cb0a5c8accb88b1b3a79f8ea6b787c57a7a5579d925 not found: ID does not exist" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.479270 4691 scope.go:117] "RemoveContainer" containerID="14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.487399 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.497693 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.546961 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.568096 4691 scope.go:117] "RemoveContainer" containerID="03bad78ac06c30a1d387147bb88237a08418b8554054a53d0bc4d89c9bff2ae8" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.590804 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ed4ad29-5963-47aa-ba01-faf16686c61d" path="/var/lib/kubelet/pods/1ed4ad29-5963-47aa-ba01-faf16686c61d/volumes" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.597125 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ce7acb7-a140-4c78-a71d-d3c96aa12651" path="/var/lib/kubelet/pods/8ce7acb7-a140-4c78-a71d-d3c96aa12651/volumes" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.597897 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:08:38 crc kubenswrapper[4691]: E1202 08:08:38.598418 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ce7acb7-a140-4c78-a71d-d3c96aa12651" containerName="rabbitmq" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.598455 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ce7acb7-a140-4c78-a71d-d3c96aa12651" containerName="rabbitmq" Dec 02 08:08:38 crc kubenswrapper[4691]: E1202 08:08:38.598487 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ce7acb7-a140-4c78-a71d-d3c96aa12651" containerName="setup-container" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.598494 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ce7acb7-a140-4c78-a71d-d3c96aa12651" containerName="setup-container" Dec 02 08:08:38 crc kubenswrapper[4691]: E1202 08:08:38.598513 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed4ad29-5963-47aa-ba01-faf16686c61d" containerName="setup-container" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.598537 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed4ad29-5963-47aa-ba01-faf16686c61d" containerName="setup-container" Dec 02 08:08:38 crc kubenswrapper[4691]: E1202 08:08:38.598561 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ed4ad29-5963-47aa-ba01-faf16686c61d" containerName="rabbitmq" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.598568 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ed4ad29-5963-47aa-ba01-faf16686c61d" containerName="rabbitmq" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.599391 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ed4ad29-5963-47aa-ba01-faf16686c61d" containerName="rabbitmq" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.599435 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ce7acb7-a140-4c78-a71d-d3c96aa12651" containerName="rabbitmq" Dec 02 08:08:38 crc 
kubenswrapper[4691]: I1202 08:08:38.600190 4691 scope.go:117] "RemoveContainer" containerID="14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.600941 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: E1202 08:08:38.601249 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899\": container with ID starting with 14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899 not found: ID does not exist" containerID="14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.601309 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899"} err="failed to get container status \"14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899\": rpc error: code = NotFound desc = could not find container \"14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899\": container with ID starting with 14dabd6081bb5abd473a04ac686cbbd2bdacf01fd6132d249f14d660284cb899 not found: ID does not exist" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.601338 4691 scope.go:117] "RemoveContainer" containerID="03bad78ac06c30a1d387147bb88237a08418b8554054a53d0bc4d89c9bff2ae8" Dec 02 08:08:38 crc kubenswrapper[4691]: E1202 08:08:38.604114 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03bad78ac06c30a1d387147bb88237a08418b8554054a53d0bc4d89c9bff2ae8\": container with ID starting with 03bad78ac06c30a1d387147bb88237a08418b8554054a53d0bc4d89c9bff2ae8 not found: ID does not exist" containerID="03bad78ac06c30a1d387147bb88237a08418b8554054a53d0bc4d89c9bff2ae8" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.604154 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03bad78ac06c30a1d387147bb88237a08418b8554054a53d0bc4d89c9bff2ae8"} err="failed to get container status \"03bad78ac06c30a1d387147bb88237a08418b8554054a53d0bc4d89c9bff2ae8\": rpc error: code = NotFound desc = could not find container \"03bad78ac06c30a1d387147bb88237a08418b8554054a53d0bc4d89c9bff2ae8\": container with ID starting with 03bad78ac06c30a1d387147bb88237a08418b8554054a53d0bc4d89c9bff2ae8 not found: ID does not exist" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.604736 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.604885 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.605101 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-g4k6l" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.605217 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.606430 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.606628 4691 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.614341 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.621975 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.628378 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.632364 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.632699 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.633001 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.633067 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.633214 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.633306 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-dxjbx" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.634009 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.639344 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.649548 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.728786 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.728827 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0573471b-7d3a-484d-9195-87918928a753-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.728858 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0573471b-7d3a-484d-9195-87918928a753-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.728890 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/0573471b-7d3a-484d-9195-87918928a753-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729004 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729026 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0573471b-7d3a-484d-9195-87918928a753-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729066 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0573471b-7d3a-484d-9195-87918928a753-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729092 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-config-data\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729106 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0573471b-7d3a-484d-9195-87918928a753-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729125 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0573471b-7d3a-484d-9195-87918928a753-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729146 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729165 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0573471b-7d3a-484d-9195-87918928a753-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729183 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpk49\" (UniqueName: 
\"kubernetes.io/projected/0573471b-7d3a-484d-9195-87918928a753-kube-api-access-zpk49\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729261 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729313 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729339 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0573471b-7d3a-484d-9195-87918928a753-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729361 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jflr5\" (UniqueName: \"kubernetes.io/projected/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-kube-api-access-jflr5\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729379 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-pod-info\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729399 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729482 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-server-conf\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729507 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.729529 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.831581 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.831630 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0573471b-7d3a-484d-9195-87918928a753-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.831661 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jflr5\" (UniqueName: \"kubernetes.io/projected/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-kube-api-access-jflr5\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.831682 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-pod-info\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.831705 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.831744 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-server-conf\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.832093 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.832504 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.832536 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.832612 
4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.832620 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.832830 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.833198 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.833298 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0573471b-7d3a-484d-9195-87918928a753-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.833380 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0573471b-7d3a-484d-9195-87918928a753-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.833591 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-server-conf\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.833894 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/0573471b-7d3a-484d-9195-87918928a753-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.834031 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0573471b-7d3a-484d-9195-87918928a753-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.834141 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.834168 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0573471b-7d3a-484d-9195-87918928a753-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.834194 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0573471b-7d3a-484d-9195-87918928a753-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.834260 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-config-data\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.834548 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0573471b-7d3a-484d-9195-87918928a753-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.834583 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0573471b-7d3a-484d-9195-87918928a753-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.834637 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.834678 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0573471b-7d3a-484d-9195-87918928a753-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.834695 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpk49\" (UniqueName: \"kubernetes.io/projected/0573471b-7d3a-484d-9195-87918928a753-kube-api-access-zpk49\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.834776 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " 
pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.836065 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-config-data\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.836393 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/0573471b-7d3a-484d-9195-87918928a753-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.836414 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-pod-info\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.836447 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/0573471b-7d3a-484d-9195-87918928a753-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.836665 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.838587 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/0573471b-7d3a-484d-9195-87918928a753-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.839541 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0573471b-7d3a-484d-9195-87918928a753-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.840140 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/0573471b-7d3a-484d-9195-87918928a753-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.840871 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.841511 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.842497 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/0573471b-7d3a-484d-9195-87918928a753-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.843245 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.846720 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/0573471b-7d3a-484d-9195-87918928a753-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.847860 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/0573471b-7d3a-484d-9195-87918928a753-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.851603 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jflr5\" (UniqueName: \"kubernetes.io/projected/178767a6-fba0-4c85-ab0c-0a3a1ffcc627-kube-api-access-jflr5\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.855553 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpk49\" (UniqueName: \"kubernetes.io/projected/0573471b-7d3a-484d-9195-87918928a753-kube-api-access-zpk49\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.881102 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"0573471b-7d3a-484d-9195-87918928a753\") " pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.890652 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-server-0\" (UID: \"178767a6-fba0-4c85-ab0c-0a3a1ffcc627\") " pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.931446 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 02 08:08:38 crc kubenswrapper[4691]: I1202 08:08:38.948046 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:08:39 crc kubenswrapper[4691]: W1202 08:08:39.482421 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod178767a6_fba0_4c85_ab0c_0a3a1ffcc627.slice/crio-2ad2f025c0888b2ae59b79ad49d983774459bf4c96964cbfc07f928bbad374e9 WatchSource:0}: Error finding container 2ad2f025c0888b2ae59b79ad49d983774459bf4c96964cbfc07f928bbad374e9: Status 404 returned error can't find the container with id 2ad2f025c0888b2ae59b79ad49d983774459bf4c96964cbfc07f928bbad374e9 Dec 02 08:08:39 crc kubenswrapper[4691]: I1202 08:08:39.497956 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 02 08:08:39 crc kubenswrapper[4691]: W1202 08:08:39.509486 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0573471b_7d3a_484d_9195_87918928a753.slice/crio-eba520ae37c4e5c528a46e9168fe8c817a4691400c44b1e078b0e19d211faf0c WatchSource:0}: Error finding container eba520ae37c4e5c528a46e9168fe8c817a4691400c44b1e078b0e19d211faf0c: Status 404 returned error can't find the container with id eba520ae37c4e5c528a46e9168fe8c817a4691400c44b1e078b0e19d211faf0c Dec 02 08:08:39 crc kubenswrapper[4691]: I1202 08:08:39.519675 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 02 08:08:40 crc kubenswrapper[4691]: I1202 08:08:40.134922 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"178767a6-fba0-4c85-ab0c-0a3a1ffcc627","Type":"ContainerStarted","Data":"2ad2f025c0888b2ae59b79ad49d983774459bf4c96964cbfc07f928bbad374e9"} Dec 02 08:08:40 crc kubenswrapper[4691]: I1202 08:08:40.137205 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0573471b-7d3a-484d-9195-87918928a753","Type":"ContainerStarted","Data":"eba520ae37c4e5c528a46e9168fe8c817a4691400c44b1e078b0e19d211faf0c"} Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.517917 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-jkt2g"] Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.519872 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.527419 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.531862 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-jkt2g"] Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.607293 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.607533 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.607600 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.608019 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.608075 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.608100 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrqb5\" (UniqueName: \"kubernetes.io/projected/165e1987-19de-4249-9266-c3101c29d221-kube-api-access-xrqb5\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.608239 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-config\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.711141 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-dns-swift-storage-0\") pod 
\"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.711275 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.711305 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.711326 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrqb5\" (UniqueName: \"kubernetes.io/projected/165e1987-19de-4249-9266-c3101c29d221-kube-api-access-xrqb5\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.711384 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-config\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.711446 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.711492 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.712596 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.712634 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.712664 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " 
pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.712875 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-config\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.713285 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.713368 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.735571 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrqb5\" (UniqueName: \"kubernetes.io/projected/165e1987-19de-4249-9266-c3101c29d221-kube-api-access-xrqb5\") pod \"dnsmasq-dns-79bd4cc8c9-jkt2g\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:41 crc kubenswrapper[4691]: I1202 08:08:41.840913 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:42 crc kubenswrapper[4691]: I1202 08:08:42.159077 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"178767a6-fba0-4c85-ab0c-0a3a1ffcc627","Type":"ContainerStarted","Data":"2682af3f76acc53e3bf0f7331c27e035e9c168db512ffde1fa96060c15c31005"} Dec 02 08:08:42 crc kubenswrapper[4691]: I1202 08:08:42.161924 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0573471b-7d3a-484d-9195-87918928a753","Type":"ContainerStarted","Data":"2c203516c7dd197930bdadee25354a7a25534fabf253b987d4f2640030e032d7"} Dec 02 08:08:42 crc kubenswrapper[4691]: W1202 08:08:42.423527 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod165e1987_19de_4249_9266_c3101c29d221.slice/crio-20cea622d28e5be93e6947713aa27028a739a4ab21ad9a00b1c9cc943a27dae1 WatchSource:0}: Error finding container 20cea622d28e5be93e6947713aa27028a739a4ab21ad9a00b1c9cc943a27dae1: Status 404 returned error can't find the container with id 20cea622d28e5be93e6947713aa27028a739a4ab21ad9a00b1c9cc943a27dae1 Dec 02 08:08:42 crc kubenswrapper[4691]: I1202 08:08:42.425391 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-jkt2g"] Dec 02 08:08:43 crc kubenswrapper[4691]: I1202 08:08:43.173279 4691 generic.go:334] "Generic (PLEG): container finished" podID="165e1987-19de-4249-9266-c3101c29d221" containerID="5c9bc551385eff83c3efc97dc452687e110ac8ef23b34de5967b019fa745e424" exitCode=0 Dec 02 08:08:43 crc kubenswrapper[4691]: I1202 08:08:43.173321 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" 
event={"ID":"165e1987-19de-4249-9266-c3101c29d221","Type":"ContainerDied","Data":"5c9bc551385eff83c3efc97dc452687e110ac8ef23b34de5967b019fa745e424"} Dec 02 08:08:43 crc kubenswrapper[4691]: I1202 08:08:43.173635 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" event={"ID":"165e1987-19de-4249-9266-c3101c29d221","Type":"ContainerStarted","Data":"20cea622d28e5be93e6947713aa27028a739a4ab21ad9a00b1c9cc943a27dae1"} Dec 02 08:08:44 crc kubenswrapper[4691]: I1202 08:08:44.187981 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" event={"ID":"165e1987-19de-4249-9266-c3101c29d221","Type":"ContainerStarted","Data":"610f20b45592ee5e4c17f952e9cb29e5fe7b0bd855ee9d3fcd855c00dedb242c"} Dec 02 08:08:44 crc kubenswrapper[4691]: I1202 08:08:44.188930 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:44 crc kubenswrapper[4691]: I1202 08:08:44.218211 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" podStartSLOduration=3.218186723 podStartE2EDuration="3.218186723s" podCreationTimestamp="2025-12-02 08:08:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:08:44.208102041 +0000 UTC m=+1371.992180923" watchObservedRunningTime="2025-12-02 08:08:44.218186723 +0000 UTC m=+1372.002265595" Dec 02 08:08:51 crc kubenswrapper[4691]: I1202 08:08:51.843004 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:08:51 crc kubenswrapper[4691]: I1202 08:08:51.920576 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-97rjx"] Dec 02 08:08:51 crc kubenswrapper[4691]: I1202 08:08:51.920850 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" podUID="d265ec22-2ef2-49d9-99d5-7b3554b6a32f" containerName="dnsmasq-dns" containerID="cri-o://b08387eed4168dd295210e3e140ea450d30f3d2fa31d69fc3b9bb8982ad8824f" gracePeriod=10 Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.102001 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-g4pgk"] Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.104723 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.111438 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-g4pgk"] Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.235350 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-config\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.235713 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-dns-svc\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.235749 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.235820 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.235841 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5fxl\" (UniqueName: \"kubernetes.io/projected/fa0cb344-97e1-42ae-867b-30322564459d-kube-api-access-s5fxl\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.235860 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.235960 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.266929 4691 generic.go:334] "Generic (PLEG): container finished" podID="d265ec22-2ef2-49d9-99d5-7b3554b6a32f" containerID="b08387eed4168dd295210e3e140ea450d30f3d2fa31d69fc3b9bb8982ad8824f" exitCode=0 Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.266974 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" 
event={"ID":"d265ec22-2ef2-49d9-99d5-7b3554b6a32f","Type":"ContainerDied","Data":"b08387eed4168dd295210e3e140ea450d30f3d2fa31d69fc3b9bb8982ad8824f"} Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.337487 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.337648 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-config\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.337701 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-dns-svc\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.337722 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.337745 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.337786 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5fxl\" (UniqueName: \"kubernetes.io/projected/fa0cb344-97e1-42ae-867b-30322564459d-kube-api-access-s5fxl\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.337803 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.340666 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-dns-svc\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.340666 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: 
\"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.341213 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-config\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.341388 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.341594 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.341670 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa0cb344-97e1-42ae-867b-30322564459d-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.363084 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5fxl\" (UniqueName: \"kubernetes.io/projected/fa0cb344-97e1-42ae-867b-30322564459d-kube-api-access-s5fxl\") pod \"dnsmasq-dns-55478c4467-g4pgk\" (UID: \"fa0cb344-97e1-42ae-867b-30322564459d\") " pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.441581 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.633514 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.644572 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-dns-swift-storage-0\") pod \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.734616 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d265ec22-2ef2-49d9-99d5-7b3554b6a32f" (UID: "d265ec22-2ef2-49d9-99d5-7b3554b6a32f"). InnerVolumeSpecName "dns-swift-storage-0". 
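
The VerifyControllerAttachedVolume / MountVolume / MountVolume.SetUp sequence above is the kubelet's volume manager bringing up the configmap and projected volumes for the new dnsmasq-dns-55478c4467-g4pgk pod before its sandbox starts. For a kubernetes.io/configmap volume, SetUp essentially materializes the map's keys as files under the pod's volumes directory. A toy sketch of that idea in Go — the writeConfigMapVolume helper and the file contents are invented, and the real kubelet writes through an atomic writer with timestamped subdirectories and symlinks:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// writeConfigMapVolume is a toy stand-in for what MountVolume.SetUp does for a
// kubernetes.io/configmap volume: create the per-pod volume directory and
// write each key of the map out as a file.
func writeConfigMapVolume(podsDir, podUID, volName string, data map[string]string) error {
	dir := filepath.Join(podsDir, podUID, "volumes/kubernetes.io~configmap", volName)
	if err := os.MkdirAll(dir, 0o755); err != nil {
		return err
	}
	for key, val := range data {
		if err := os.WriteFile(filepath.Join(dir, key), []byte(val), 0o644); err != nil {
			return err
		}
	}
	return nil
}

func main() {
	// Pod UID and volume name taken from the log lines above; contents invented.
	err := writeConfigMapVolume(os.TempDir(), "fa0cb344-97e1-42ae-867b-30322564459d",
		"dns-svc", map[string]string{"dns-svc.conf": "server=/svc/127.0.0.1\n"})
	fmt.Println("SetUp:", err)
}
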
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.746470 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtqxl\" (UniqueName: \"kubernetes.io/projected/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-kube-api-access-wtqxl\") pod \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.746544 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-ovsdbserver-nb\") pod \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.746713 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-config\") pod \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.746797 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-dns-svc\") pod \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.746896 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-ovsdbserver-sb\") pod \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\" (UID: \"d265ec22-2ef2-49d9-99d5-7b3554b6a32f\") " Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.747393 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.764151 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-kube-api-access-wtqxl" (OuterVolumeSpecName: "kube-api-access-wtqxl") pod "d265ec22-2ef2-49d9-99d5-7b3554b6a32f" (UID: "d265ec22-2ef2-49d9-99d5-7b3554b6a32f"). InnerVolumeSpecName "kube-api-access-wtqxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.806141 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d265ec22-2ef2-49d9-99d5-7b3554b6a32f" (UID: "d265ec22-2ef2-49d9-99d5-7b3554b6a32f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.814649 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d265ec22-2ef2-49d9-99d5-7b3554b6a32f" (UID: "d265ec22-2ef2-49d9-99d5-7b3554b6a32f"). InnerVolumeSpecName "dns-svc". 
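
Taken together, the MountVolume entries for the new pod (UID fa0cb344-...) and the UnmountVolume.TearDown entries for the old one (UID d265ec22-...) show the reconciler's core loop: diff desired state (volumes that pods scheduled here still reference) against actual state (volumes currently set up) and emit mount or unmount operations. A minimal sketch of that pattern, with invented state shapes rather than the kubelet's real volumemanager types:

package main

import "fmt"

// reconcile compares desired and actual volume sets and returns the
// operations to start, mirroring the "MountVolume started" and
// "UnmountVolume started" pairs in the log above.
func reconcile(desired, actual map[string]bool) (mounts, unmounts []string) {
	for v := range desired {
		if !actual[v] {
			mounts = append(mounts, v) // operationExecutor.MountVolume started
		}
	}
	for v := range actual {
		if !desired[v] {
			unmounts = append(unmounts, v) // operationExecutor.UnmountVolume started
		}
	}
	return
}

func main() {
	desired := map[string]bool{"fa0cb344/config": true, "fa0cb344/dns-svc": true}
	actual := map[string]bool{"d265ec22/config": true, "d265ec22/dns-svc": true}
	m, u := reconcile(desired, actual)
	fmt.Println("mount:", m)
	fmt.Println("unmount:", u)
}

Once TearDown succeeds, the "Volume detached ... DevicePath \"\"" lines record the actual-state update; the DevicePath is empty because configmap and projected volumes are node-local, so there is no device to detach.
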
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.816069 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d265ec22-2ef2-49d9-99d5-7b3554b6a32f" (UID: "d265ec22-2ef2-49d9-99d5-7b3554b6a32f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.821781 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-g4pgk"] Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.835129 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-config" (OuterVolumeSpecName: "config") pod "d265ec22-2ef2-49d9-99d5-7b3554b6a32f" (UID: "d265ec22-2ef2-49d9-99d5-7b3554b6a32f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.851852 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.851943 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.853320 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.853336 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtqxl\" (UniqueName: \"kubernetes.io/projected/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-kube-api-access-wtqxl\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:52 crc kubenswrapper[4691]: I1202 08:08:52.853346 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d265ec22-2ef2-49d9-99d5-7b3554b6a32f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:08:53 crc kubenswrapper[4691]: I1202 08:08:53.287349 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" event={"ID":"d265ec22-2ef2-49d9-99d5-7b3554b6a32f","Type":"ContainerDied","Data":"ed07e8babc3b6132924bd738ec3ae8c47729cc5b43b9a716f94a79900d89d715"} Dec 02 08:08:53 crc kubenswrapper[4691]: I1202 08:08:53.287683 4691 scope.go:117] "RemoveContainer" containerID="b08387eed4168dd295210e3e140ea450d30f3d2fa31d69fc3b9bb8982ad8824f" Dec 02 08:08:53 crc kubenswrapper[4691]: I1202 08:08:53.287400 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-97rjx" Dec 02 08:08:53 crc kubenswrapper[4691]: I1202 08:08:53.290022 4691 generic.go:334] "Generic (PLEG): container finished" podID="fa0cb344-97e1-42ae-867b-30322564459d" containerID="00ccd37c13fef00fb63b75d840d0893ad4d8f4ca98dfe02fb7e44ef156e7f484" exitCode=0 Dec 02 08:08:53 crc kubenswrapper[4691]: I1202 08:08:53.290055 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-g4pgk" event={"ID":"fa0cb344-97e1-42ae-867b-30322564459d","Type":"ContainerDied","Data":"00ccd37c13fef00fb63b75d840d0893ad4d8f4ca98dfe02fb7e44ef156e7f484"} Dec 02 08:08:53 crc kubenswrapper[4691]: I1202 08:08:53.290086 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-g4pgk" event={"ID":"fa0cb344-97e1-42ae-867b-30322564459d","Type":"ContainerStarted","Data":"1ee67b27b646a0efdac95b7423ab64e5ac0752c7b359639b2f66550aa2c1e3ad"} Dec 02 08:08:53 crc kubenswrapper[4691]: I1202 08:08:53.490458 4691 scope.go:117] "RemoveContainer" containerID="7b55d608ba646059f356b58cf6be84f7d2172e267c04b75c23af0bde1952396b" Dec 02 08:08:53 crc kubenswrapper[4691]: I1202 08:08:53.517381 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-97rjx"] Dec 02 08:08:53 crc kubenswrapper[4691]: I1202 08:08:53.527275 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-97rjx"] Dec 02 08:08:54 crc kubenswrapper[4691]: I1202 08:08:54.302256 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-g4pgk" event={"ID":"fa0cb344-97e1-42ae-867b-30322564459d","Type":"ContainerStarted","Data":"c943ced5811becfe747f7e5249212cb2318afa0db13098557845abc70de4d22a"} Dec 02 08:08:54 crc kubenswrapper[4691]: I1202 08:08:54.302400 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:08:54 crc kubenswrapper[4691]: I1202 08:08:54.574315 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d265ec22-2ef2-49d9-99d5-7b3554b6a32f" path="/var/lib/kubelet/pods/d265ec22-2ef2-49d9-99d5-7b3554b6a32f/volumes" Dec 02 08:09:02 crc kubenswrapper[4691]: I1202 08:09:02.443966 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-g4pgk" Dec 02 08:09:02 crc kubenswrapper[4691]: I1202 08:09:02.476941 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-g4pgk" podStartSLOduration=10.476911643 podStartE2EDuration="10.476911643s" podCreationTimestamp="2025-12-02 08:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:08:54.330081213 +0000 UTC m=+1382.114160085" watchObservedRunningTime="2025-12-02 08:09:02.476911643 +0000 UTC m=+1390.260990515" Dec 02 08:09:02 crc kubenswrapper[4691]: I1202 08:09:02.522254 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-jkt2g"] Dec 02 08:09:02 crc kubenswrapper[4691]: I1202 08:09:02.522586 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" podUID="165e1987-19de-4249-9266-c3101c29d221" containerName="dnsmasq-dns" containerID="cri-o://610f20b45592ee5e4c17f952e9cb29e5fe7b0bd855ee9d3fcd855c00dedb242c" gracePeriod=10 Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.441344 4691 
generic.go:334] "Generic (PLEG): container finished" podID="165e1987-19de-4249-9266-c3101c29d221" containerID="610f20b45592ee5e4c17f952e9cb29e5fe7b0bd855ee9d3fcd855c00dedb242c" exitCode=0 Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.441610 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" event={"ID":"165e1987-19de-4249-9266-c3101c29d221","Type":"ContainerDied","Data":"610f20b45592ee5e4c17f952e9cb29e5fe7b0bd855ee9d3fcd855c00dedb242c"} Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.697481 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.750609 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrqb5\" (UniqueName: \"kubernetes.io/projected/165e1987-19de-4249-9266-c3101c29d221-kube-api-access-xrqb5\") pod \"165e1987-19de-4249-9266-c3101c29d221\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.750726 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-dns-swift-storage-0\") pod \"165e1987-19de-4249-9266-c3101c29d221\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.750860 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-ovsdbserver-nb\") pod \"165e1987-19de-4249-9266-c3101c29d221\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.750893 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-ovsdbserver-sb\") pod \"165e1987-19de-4249-9266-c3101c29d221\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.750925 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-openstack-edpm-ipam\") pod \"165e1987-19de-4249-9266-c3101c29d221\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.750953 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-config\") pod \"165e1987-19de-4249-9266-c3101c29d221\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.751052 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-dns-svc\") pod \"165e1987-19de-4249-9266-c3101c29d221\" (UID: \"165e1987-19de-4249-9266-c3101c29d221\") " Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.760265 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/165e1987-19de-4249-9266-c3101c29d221-kube-api-access-xrqb5" (OuterVolumeSpecName: "kube-api-access-xrqb5") pod "165e1987-19de-4249-9266-c3101c29d221" (UID: "165e1987-19de-4249-9266-c3101c29d221"). 
InnerVolumeSpecName "kube-api-access-xrqb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.812892 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "165e1987-19de-4249-9266-c3101c29d221" (UID: "165e1987-19de-4249-9266-c3101c29d221"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.814909 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "165e1987-19de-4249-9266-c3101c29d221" (UID: "165e1987-19de-4249-9266-c3101c29d221"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.815810 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "165e1987-19de-4249-9266-c3101c29d221" (UID: "165e1987-19de-4249-9266-c3101c29d221"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.816034 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "165e1987-19de-4249-9266-c3101c29d221" (UID: "165e1987-19de-4249-9266-c3101c29d221"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.822337 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-config" (OuterVolumeSpecName: "config") pod "165e1987-19de-4249-9266-c3101c29d221" (UID: "165e1987-19de-4249-9266-c3101c29d221"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.849713 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "165e1987-19de-4249-9266-c3101c29d221" (UID: "165e1987-19de-4249-9266-c3101c29d221"). InnerVolumeSpecName "openstack-edpm-ipam". 
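
The dnsmasq-dns-79bd4cc8c9-jkt2g teardown above follows the standard graceful-deletion chain: SyncLoop DELETE from the API, "Killing container with a grace period" (gracePeriod=10), a PLEG ContainerDied event once dnsmasq exits (exitCode 0 here, so it honored the signal), then the volume unmounts. A small sketch of the grace-period wait, assuming a channel-based exit notification rather than the kubelet's real CRI StopContainer call:

package main

import (
	"context"
	"fmt"
	"time"
)

// stopContainer mimics the "Killing container with a grace period" step:
// ask the container to exit, wait up to gracePeriod, then give up and
// report a forced kill.
func stopContainer(id string, gracePeriod time.Duration, exited <-chan struct{}) string {
	fmt.Printf("Killing container %s with a %s grace period\n", id, gracePeriod)
	ctx, cancel := context.WithTimeout(context.Background(), gracePeriod)
	defer cancel()
	select {
	case <-exited: // container exited on its own, like dnsmasq-dns here (exitCode=0)
		return "ContainerDied (graceful)"
	case <-ctx.Done():
		return "ContainerDied (force-killed after grace period)"
	}
}

func main() {
	exited := make(chan struct{})
	go func() { time.Sleep(100 * time.Millisecond); close(exited) }()
	fmt.Println(stopContainer("610f20b45592", 10*time.Second, exited))
}

After the unmounts finish, SyncLoop REMOVE confirms the API object is gone and kubelet_volumes cleans up the orphaned /var/lib/kubelet/pods/<uid>/volumes directory, as seen for both dnsmasq pods in this log.
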
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.853591 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrqb5\" (UniqueName: \"kubernetes.io/projected/165e1987-19de-4249-9266-c3101c29d221-kube-api-access-xrqb5\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.853622 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.853660 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.853672 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.853680 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.853689 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:03 crc kubenswrapper[4691]: I1202 08:09:03.853699 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/165e1987-19de-4249-9266-c3101c29d221-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:04 crc kubenswrapper[4691]: I1202 08:09:04.454592 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" event={"ID":"165e1987-19de-4249-9266-c3101c29d221","Type":"ContainerDied","Data":"20cea622d28e5be93e6947713aa27028a739a4ab21ad9a00b1c9cc943a27dae1"} Dec 02 08:09:04 crc kubenswrapper[4691]: I1202 08:09:04.454656 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-jkt2g" Dec 02 08:09:04 crc kubenswrapper[4691]: I1202 08:09:04.454678 4691 scope.go:117] "RemoveContainer" containerID="610f20b45592ee5e4c17f952e9cb29e5fe7b0bd855ee9d3fcd855c00dedb242c" Dec 02 08:09:04 crc kubenswrapper[4691]: I1202 08:09:04.499365 4691 scope.go:117] "RemoveContainer" containerID="5c9bc551385eff83c3efc97dc452687e110ac8ef23b34de5967b019fa745e424" Dec 02 08:09:04 crc kubenswrapper[4691]: I1202 08:09:04.500347 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-jkt2g"] Dec 02 08:09:04 crc kubenswrapper[4691]: I1202 08:09:04.515311 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-jkt2g"] Dec 02 08:09:04 crc kubenswrapper[4691]: I1202 08:09:04.576029 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="165e1987-19de-4249-9266-c3101c29d221" path="/var/lib/kubelet/pods/165e1987-19de-4249-9266-c3101c29d221/volumes" Dec 02 08:09:13 crc kubenswrapper[4691]: I1202 08:09:13.533632 4691 generic.go:334] "Generic (PLEG): container finished" podID="0573471b-7d3a-484d-9195-87918928a753" containerID="2c203516c7dd197930bdadee25354a7a25534fabf253b987d4f2640030e032d7" exitCode=0 Dec 02 08:09:13 crc kubenswrapper[4691]: I1202 08:09:13.533991 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0573471b-7d3a-484d-9195-87918928a753","Type":"ContainerDied","Data":"2c203516c7dd197930bdadee25354a7a25534fabf253b987d4f2640030e032d7"} Dec 02 08:09:14 crc kubenswrapper[4691]: I1202 08:09:14.545589 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"0573471b-7d3a-484d-9195-87918928a753","Type":"ContainerStarted","Data":"3e0d1abcfa6ae99a5cb00801d17d5abf77c8ea99792da1350e347f3f15c0ac50"} Dec 02 08:09:14 crc kubenswrapper[4691]: I1202 08:09:14.546086 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:09:14 crc kubenswrapper[4691]: I1202 08:09:14.547881 4691 generic.go:334] "Generic (PLEG): container finished" podID="178767a6-fba0-4c85-ab0c-0a3a1ffcc627" containerID="2682af3f76acc53e3bf0f7331c27e035e9c168db512ffde1fa96060c15c31005" exitCode=0 Dec 02 08:09:14 crc kubenswrapper[4691]: I1202 08:09:14.547934 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"178767a6-fba0-4c85-ab0c-0a3a1ffcc627","Type":"ContainerDied","Data":"2682af3f76acc53e3bf0f7331c27e035e9c168db512ffde1fa96060c15c31005"} Dec 02 08:09:14 crc kubenswrapper[4691]: I1202 08:09:14.598395 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.598375625 podStartE2EDuration="36.598375625s" podCreationTimestamp="2025-12-02 08:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:09:14.577808013 +0000 UTC m=+1402.361886895" watchObservedRunningTime="2025-12-02 08:09:14.598375625 +0000 UTC m=+1402.382454497" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.537180 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr"] Dec 02 08:09:15 crc kubenswrapper[4691]: E1202 08:09:15.537931 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d265ec22-2ef2-49d9-99d5-7b3554b6a32f" 
containerName="dnsmasq-dns" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.537950 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d265ec22-2ef2-49d9-99d5-7b3554b6a32f" containerName="dnsmasq-dns" Dec 02 08:09:15 crc kubenswrapper[4691]: E1202 08:09:15.537962 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d265ec22-2ef2-49d9-99d5-7b3554b6a32f" containerName="init" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.537969 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d265ec22-2ef2-49d9-99d5-7b3554b6a32f" containerName="init" Dec 02 08:09:15 crc kubenswrapper[4691]: E1202 08:09:15.537990 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="165e1987-19de-4249-9266-c3101c29d221" containerName="dnsmasq-dns" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.537996 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="165e1987-19de-4249-9266-c3101c29d221" containerName="dnsmasq-dns" Dec 02 08:09:15 crc kubenswrapper[4691]: E1202 08:09:15.538032 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="165e1987-19de-4249-9266-c3101c29d221" containerName="init" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.538038 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="165e1987-19de-4249-9266-c3101c29d221" containerName="init" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.538236 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d265ec22-2ef2-49d9-99d5-7b3554b6a32f" containerName="dnsmasq-dns" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.538269 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="165e1987-19de-4249-9266-c3101c29d221" containerName="dnsmasq-dns" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.538990 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.549789 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.550074 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.550224 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.550712 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.567271 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr"] Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.580347 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"178767a6-fba0-4c85-ab0c-0a3a1ffcc627","Type":"ContainerStarted","Data":"e4b9bb71f895d0ab17da7b349918ccb4def9629ff6993672988e4283a6acce58"} Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.580614 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.615519 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.615492257 podStartE2EDuration="37.615492257s" podCreationTimestamp="2025-12-02 08:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:09:15.604862162 +0000 UTC m=+1403.388941024" watchObservedRunningTime="2025-12-02 08:09:15.615492257 +0000 UTC m=+1403.399571119" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.634369 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls5dd\" (UniqueName: \"kubernetes.io/projected/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-kube-api-access-ls5dd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.634553 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.634590 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.634674 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.737134 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.737333 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls5dd\" (UniqueName: \"kubernetes.io/projected/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-kube-api-access-ls5dd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.737376 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.737399 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.743710 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.749358 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.749543 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.756187 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls5dd\" 
(UniqueName: \"kubernetes.io/projected/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-kube-api-access-ls5dd\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:15 crc kubenswrapper[4691]: I1202 08:09:15.863292 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:16 crc kubenswrapper[4691]: I1202 08:09:16.514767 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr"] Dec 02 08:09:16 crc kubenswrapper[4691]: I1202 08:09:16.595383 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" event={"ID":"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1","Type":"ContainerStarted","Data":"ab93a4d1e85ba3e51796c13530741d97ac0397e8352b7b3b37e2f35c741d4494"} Dec 02 08:09:21 crc kubenswrapper[4691]: I1202 08:09:21.899291 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:09:21 crc kubenswrapper[4691]: I1202 08:09:21.899869 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:09:28 crc kubenswrapper[4691]: I1202 08:09:28.726244 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" event={"ID":"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1","Type":"ContainerStarted","Data":"854e12725d9c26389928445473975dbb8f2c7797778ae5f6c564d2d9c755bd2b"} Dec 02 08:09:28 crc kubenswrapper[4691]: I1202 08:09:28.755978 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" podStartSLOduration=1.9241956789999999 podStartE2EDuration="13.75595017s" podCreationTimestamp="2025-12-02 08:09:15 +0000 UTC" firstStartedPulling="2025-12-02 08:09:16.549868358 +0000 UTC m=+1404.333947210" lastFinishedPulling="2025-12-02 08:09:28.381622849 +0000 UTC m=+1416.165701701" observedRunningTime="2025-12-02 08:09:28.744504384 +0000 UTC m=+1416.528583256" watchObservedRunningTime="2025-12-02 08:09:28.75595017 +0000 UTC m=+1416.540029032" Dec 02 08:09:28 crc kubenswrapper[4691]: I1202 08:09:28.935966 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 02 08:09:28 crc kubenswrapper[4691]: I1202 08:09:28.952319 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 02 08:09:40 crc kubenswrapper[4691]: I1202 08:09:40.835941 4691 generic.go:334] "Generic (PLEG): container finished" podID="2a8c0a05-f1a6-4a5e-9598-9146f0074dc1" containerID="854e12725d9c26389928445473975dbb8f2c7797778ae5f6c564d2d9c755bd2b" exitCode=0 Dec 02 08:09:40 crc kubenswrapper[4691]: I1202 08:09:40.836070 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" 
event={"ID":"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1","Type":"ContainerDied","Data":"854e12725d9c26389928445473975dbb8f2c7797778ae5f6c564d2d9c755bd2b"} Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.318177 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.478112 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-inventory\") pod \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.478341 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-ssh-key\") pod \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.478643 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls5dd\" (UniqueName: \"kubernetes.io/projected/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-kube-api-access-ls5dd\") pod \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.478715 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-repo-setup-combined-ca-bundle\") pod \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\" (UID: \"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1\") " Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.532109 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-kube-api-access-ls5dd" (OuterVolumeSpecName: "kube-api-access-ls5dd") pod "2a8c0a05-f1a6-4a5e-9598-9146f0074dc1" (UID: "2a8c0a05-f1a6-4a5e-9598-9146f0074dc1"). InnerVolumeSpecName "kube-api-access-ls5dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.534036 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2a8c0a05-f1a6-4a5e-9598-9146f0074dc1" (UID: "2a8c0a05-f1a6-4a5e-9598-9146f0074dc1"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
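
The pod_startup_latency_tracker entry for repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr above shows how its two durations relate: podStartE2EDuration (13.75595017s) is observed-running minus creation, while podStartSLOduration excludes the image-pull window (m=+1404.333947210 to m=+1416.165701701, i.e. 11.831754491s), leaving 1.924195679s — the logged 1.9241956789999999 up to float64 noise. The arithmetic, using the monotonic readings from the log:

package main

import "fmt"

func main() {
	// Monotonic clock readings (seconds) from the repo-setup pod's log entry above.
	firstStartedPulling := 1404.333947210
	lastFinishedPulling := 1416.165701701
	e2e := 13.75595017 // podStartE2EDuration: running - creation

	pull := lastFinishedPulling - firstStartedPulling
	slo := e2e - pull // podStartSLOduration excludes image-pull time
	fmt.Printf("pull=%.9fs slo=%.9fs\n", pull, slo)
	// Prints slo=1.924195679s; the long tail in the logged value is
	// ordinary float64 rounding, not a measurement artifact.
}

For the earlier dnsmasq and rabbitmq pods the pull timestamps are the zero value (0001-01-01), meaning the images were already present, which is why their SLO and E2E durations are identical.
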
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.600430 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls5dd\" (UniqueName: \"kubernetes.io/projected/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-kube-api-access-ls5dd\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.600473 4691 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.617983 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-inventory" (OuterVolumeSpecName: "inventory") pod "2a8c0a05-f1a6-4a5e-9598-9146f0074dc1" (UID: "2a8c0a05-f1a6-4a5e-9598-9146f0074dc1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.680107 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "2a8c0a05-f1a6-4a5e-9598-9146f0074dc1" (UID: "2a8c0a05-f1a6-4a5e-9598-9146f0074dc1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.703054 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.703095 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/2a8c0a05-f1a6-4a5e-9598-9146f0074dc1-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.856245 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" event={"ID":"2a8c0a05-f1a6-4a5e-9598-9146f0074dc1","Type":"ContainerDied","Data":"ab93a4d1e85ba3e51796c13530741d97ac0397e8352b7b3b37e2f35c741d4494"} Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.856289 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.856294 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab93a4d1e85ba3e51796c13530741d97ac0397e8352b7b3b37e2f35c741d4494" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.953066 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj"] Dec 02 08:09:42 crc kubenswrapper[4691]: E1202 08:09:42.953870 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a8c0a05-f1a6-4a5e-9598-9146f0074dc1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.953891 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a8c0a05-f1a6-4a5e-9598-9146f0074dc1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.954102 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a8c0a05-f1a6-4a5e-9598-9146f0074dc1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.954884 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.958199 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.958374 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.959794 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.960378 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:09:42 crc kubenswrapper[4691]: I1202 08:09:42.970113 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj"] Dec 02 08:09:43 crc kubenswrapper[4691]: I1202 08:09:43.110684 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34cc12b6-9f55-450f-b073-0e89d0889946-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c7mwj\" (UID: \"34cc12b6-9f55-450f-b073-0e89d0889946\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:43 crc kubenswrapper[4691]: I1202 08:09:43.110794 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzwhp\" (UniqueName: \"kubernetes.io/projected/34cc12b6-9f55-450f-b073-0e89d0889946-kube-api-access-kzwhp\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c7mwj\" (UID: \"34cc12b6-9f55-450f-b073-0e89d0889946\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:43 crc kubenswrapper[4691]: I1202 08:09:43.110893 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34cc12b6-9f55-450f-b073-0e89d0889946-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c7mwj\" (UID: \"34cc12b6-9f55-450f-b073-0e89d0889946\") 
" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:43 crc kubenswrapper[4691]: I1202 08:09:43.213952 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzwhp\" (UniqueName: \"kubernetes.io/projected/34cc12b6-9f55-450f-b073-0e89d0889946-kube-api-access-kzwhp\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c7mwj\" (UID: \"34cc12b6-9f55-450f-b073-0e89d0889946\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:43 crc kubenswrapper[4691]: I1202 08:09:43.214543 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34cc12b6-9f55-450f-b073-0e89d0889946-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c7mwj\" (UID: \"34cc12b6-9f55-450f-b073-0e89d0889946\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:43 crc kubenswrapper[4691]: I1202 08:09:43.215174 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34cc12b6-9f55-450f-b073-0e89d0889946-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c7mwj\" (UID: \"34cc12b6-9f55-450f-b073-0e89d0889946\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:43 crc kubenswrapper[4691]: I1202 08:09:43.223280 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34cc12b6-9f55-450f-b073-0e89d0889946-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c7mwj\" (UID: \"34cc12b6-9f55-450f-b073-0e89d0889946\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:43 crc kubenswrapper[4691]: I1202 08:09:43.231495 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34cc12b6-9f55-450f-b073-0e89d0889946-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c7mwj\" (UID: \"34cc12b6-9f55-450f-b073-0e89d0889946\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:43 crc kubenswrapper[4691]: I1202 08:09:43.236584 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzwhp\" (UniqueName: \"kubernetes.io/projected/34cc12b6-9f55-450f-b073-0e89d0889946-kube-api-access-kzwhp\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-c7mwj\" (UID: \"34cc12b6-9f55-450f-b073-0e89d0889946\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:43 crc kubenswrapper[4691]: I1202 08:09:43.274814 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:43 crc kubenswrapper[4691]: I1202 08:09:43.840567 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj"] Dec 02 08:09:43 crc kubenswrapper[4691]: I1202 08:09:43.871298 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" event={"ID":"34cc12b6-9f55-450f-b073-0e89d0889946","Type":"ContainerStarted","Data":"d15448b7d5943f2f810dc055031b51c23bd59b027643204e1dbf6dbc4310b764"} Dec 02 08:09:44 crc kubenswrapper[4691]: I1202 08:09:44.882295 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" event={"ID":"34cc12b6-9f55-450f-b073-0e89d0889946","Type":"ContainerStarted","Data":"4c0dd6e902ac1ee6473c08f59f395cb5bdcf546e46f43210b152d6f2c94ae792"} Dec 02 08:09:44 crc kubenswrapper[4691]: I1202 08:09:44.903886 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" podStartSLOduration=2.45956407 podStartE2EDuration="2.903863825s" podCreationTimestamp="2025-12-02 08:09:42 +0000 UTC" firstStartedPulling="2025-12-02 08:09:43.856941858 +0000 UTC m=+1431.641020730" lastFinishedPulling="2025-12-02 08:09:44.301241623 +0000 UTC m=+1432.085320485" observedRunningTime="2025-12-02 08:09:44.900926911 +0000 UTC m=+1432.685005783" watchObservedRunningTime="2025-12-02 08:09:44.903863825 +0000 UTC m=+1432.687942687" Dec 02 08:09:49 crc kubenswrapper[4691]: I1202 08:09:49.316705 4691 generic.go:334] "Generic (PLEG): container finished" podID="34cc12b6-9f55-450f-b073-0e89d0889946" containerID="4c0dd6e902ac1ee6473c08f59f395cb5bdcf546e46f43210b152d6f2c94ae792" exitCode=0 Dec 02 08:09:49 crc kubenswrapper[4691]: I1202 08:09:49.316791 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" event={"ID":"34cc12b6-9f55-450f-b073-0e89d0889946","Type":"ContainerDied","Data":"4c0dd6e902ac1ee6473c08f59f395cb5bdcf546e46f43210b152d6f2c94ae792"} Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:50.728357 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:50.842480 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34cc12b6-9f55-450f-b073-0e89d0889946-inventory\") pod \"34cc12b6-9f55-450f-b073-0e89d0889946\" (UID: \"34cc12b6-9f55-450f-b073-0e89d0889946\") " Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:50.842638 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kzwhp\" (UniqueName: \"kubernetes.io/projected/34cc12b6-9f55-450f-b073-0e89d0889946-kube-api-access-kzwhp\") pod \"34cc12b6-9f55-450f-b073-0e89d0889946\" (UID: \"34cc12b6-9f55-450f-b073-0e89d0889946\") " Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:50.842751 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34cc12b6-9f55-450f-b073-0e89d0889946-ssh-key\") pod \"34cc12b6-9f55-450f-b073-0e89d0889946\" (UID: \"34cc12b6-9f55-450f-b073-0e89d0889946\") " Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:50.848524 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34cc12b6-9f55-450f-b073-0e89d0889946-kube-api-access-kzwhp" (OuterVolumeSpecName: "kube-api-access-kzwhp") pod "34cc12b6-9f55-450f-b073-0e89d0889946" (UID: "34cc12b6-9f55-450f-b073-0e89d0889946"). InnerVolumeSpecName "kube-api-access-kzwhp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:50.872957 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34cc12b6-9f55-450f-b073-0e89d0889946-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "34cc12b6-9f55-450f-b073-0e89d0889946" (UID: "34cc12b6-9f55-450f-b073-0e89d0889946"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:50.897101 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34cc12b6-9f55-450f-b073-0e89d0889946-inventory" (OuterVolumeSpecName: "inventory") pod "34cc12b6-9f55-450f-b073-0e89d0889946" (UID: "34cc12b6-9f55-450f-b073-0e89d0889946"). InnerVolumeSpecName "inventory". 
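
The cpu_manager "RemoveStaleState: removing container" and state_mem "Deleted CPUSet assignment" lines that follow each of these one-shot EDPM job pods (repo-setup earlier, redhat here) are housekeeping, not failures, despite the E log level: when the next pod is admitted, resource-manager state for pods that no longer exist gets dropped. A toy version of that cleanup pass — the assignment map's shape is invented for illustration:

package main

import "fmt"

type key struct{ podUID, container string }

// removeStaleState drops resource assignments for containers whose pods
// are no longer active, mirroring the cpu_manager/state_mem log lines.
func removeStaleState(assignments map[key]string, activePods map[string]bool) {
	for k := range assignments {
		if !activePods[k.podUID] {
			fmt.Printf("Deleted CPUSet assignment podUID=%q container=%q\n", k.podUID, k.container)
			delete(assignments, k)
		}
	}
}

func main() {
	state := map[key]string{
		{"34cc12b6-9f55-450f-b073-0e89d0889946", "redhat-edpm-deployment-openstack-edpm-ipam"}: "0-3",
	}
	removeStaleState(state, map[string]bool{}) // the redhat job pod has been removed
}
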
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:50.944976 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kzwhp\" (UniqueName: \"kubernetes.io/projected/34cc12b6-9f55-450f-b073-0e89d0889946-kube-api-access-kzwhp\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:50.945006 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/34cc12b6-9f55-450f-b073-0e89d0889946-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:50.945016 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/34cc12b6-9f55-450f-b073-0e89d0889946-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.338166 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" event={"ID":"34cc12b6-9f55-450f-b073-0e89d0889946","Type":"ContainerDied","Data":"d15448b7d5943f2f810dc055031b51c23bd59b027643204e1dbf6dbc4310b764"} Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.338210 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d15448b7d5943f2f810dc055031b51c23bd59b027643204e1dbf6dbc4310b764" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.338236 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-c7mwj" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.456649 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572"] Dec 02 08:09:51 crc kubenswrapper[4691]: E1202 08:09:51.457222 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34cc12b6-9f55-450f-b073-0e89d0889946" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.457251 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cc12b6-9f55-450f-b073-0e89d0889946" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.457495 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="34cc12b6-9f55-450f-b073-0e89d0889946" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.458432 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.461069 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.461431 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.461813 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.467639 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.470847 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572"] Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.610386 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlbv5\" (UniqueName: \"kubernetes.io/projected/6eabed67-587a-402c-8d6f-02163a229356-kube-api-access-hlbv5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ws572\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.610457 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ws572\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.610561 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ws572\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.610586 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ws572\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.713260 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlbv5\" (UniqueName: \"kubernetes.io/projected/6eabed67-587a-402c-8d6f-02163a229356-kube-api-access-hlbv5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ws572\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.713577 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-inventory\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-ws572\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.713806 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ws572\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.713952 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ws572\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.720034 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ws572\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.720524 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ws572\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.730603 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ws572\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.730876 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlbv5\" (UniqueName: \"kubernetes.io/projected/6eabed67-587a-402c-8d6f-02163a229356-kube-api-access-hlbv5\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ws572\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.785351 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.898567 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:09:51 crc kubenswrapper[4691]: I1202 08:09:51.898637 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:09:52 crc kubenswrapper[4691]: I1202 08:09:52.317662 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572"] Dec 02 08:09:52 crc kubenswrapper[4691]: I1202 08:09:52.350161 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" event={"ID":"6eabed67-587a-402c-8d6f-02163a229356","Type":"ContainerStarted","Data":"24012a4ba0f7b0e45f4f4d06b0f1be8a76eb2ee09faf31c489e4a9f954ab3332"} Dec 02 08:09:52 crc kubenswrapper[4691]: I1202 08:09:52.905031 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:09:53 crc kubenswrapper[4691]: I1202 08:09:53.360402 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" event={"ID":"6eabed67-587a-402c-8d6f-02163a229356","Type":"ContainerStarted","Data":"a6fdba709896d1a466248b6adb44a72ace62723b378d200aca9e84f3562b5799"} Dec 02 08:10:21 crc kubenswrapper[4691]: I1202 08:10:21.898835 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:10:21 crc kubenswrapper[4691]: I1202 08:10:21.899638 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:10:21 crc kubenswrapper[4691]: I1202 08:10:21.899689 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 08:10:21 crc kubenswrapper[4691]: I1202 08:10:21.900568 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"35bf2b176e04ba95989431e7a1c5a8ad045d68a6d864e710ee0e03b73b56f536"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:10:21 crc kubenswrapper[4691]: I1202 08:10:21.900618 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" 
containerID="cri-o://35bf2b176e04ba95989431e7a1c5a8ad045d68a6d864e710ee0e03b73b56f536" gracePeriod=600 Dec 02 08:10:22 crc kubenswrapper[4691]: I1202 08:10:22.676571 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="35bf2b176e04ba95989431e7a1c5a8ad045d68a6d864e710ee0e03b73b56f536" exitCode=0 Dec 02 08:10:22 crc kubenswrapper[4691]: I1202 08:10:22.676656 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"35bf2b176e04ba95989431e7a1c5a8ad045d68a6d864e710ee0e03b73b56f536"} Dec 02 08:10:22 crc kubenswrapper[4691]: I1202 08:10:22.677214 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4"} Dec 02 08:10:22 crc kubenswrapper[4691]: I1202 08:10:22.677245 4691 scope.go:117] "RemoveContainer" containerID="3c6622064f9b8ca4e8932f776c16ae5af9973dd396f94dcf631a7aa1f00aa037" Dec 02 08:10:22 crc kubenswrapper[4691]: I1202 08:10:22.700831 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" podStartSLOduration=31.121428237 podStartE2EDuration="31.700809009s" podCreationTimestamp="2025-12-02 08:09:51 +0000 UTC" firstStartedPulling="2025-12-02 08:09:52.322734549 +0000 UTC m=+1440.106813411" lastFinishedPulling="2025-12-02 08:09:52.902115321 +0000 UTC m=+1440.686194183" observedRunningTime="2025-12-02 08:09:53.379326066 +0000 UTC m=+1441.163404948" watchObservedRunningTime="2025-12-02 08:10:22.700809009 +0000 UTC m=+1470.484887871" Dec 02 08:10:28 crc kubenswrapper[4691]: I1202 08:10:28.386199 4691 scope.go:117] "RemoveContainer" containerID="4cbb75644dab97ed0e058501793455d3498cbeb17704bdf1e3d4fe2ce8895111" Dec 02 08:10:28 crc kubenswrapper[4691]: I1202 08:10:28.547108 4691 scope.go:117] "RemoveContainer" containerID="e229829d9ac58b537f184ac5848a971c75f0301410a22ce13a9ecbf22402bba2" Dec 02 08:10:52 crc kubenswrapper[4691]: I1202 08:10:52.256902 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-76d578d5f5-hmcbw" podUID="de7d695d-6d9a-4de2-830e-579f9d496f08" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.212089 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r9xjd"] Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.215688 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.283295 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9xjd"] Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.415551 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gbqn\" (UniqueName: \"kubernetes.io/projected/cba59e59-e22a-490c-b63e-47cf86b29185-kube-api-access-7gbqn\") pod \"community-operators-r9xjd\" (UID: \"cba59e59-e22a-490c-b63e-47cf86b29185\") " pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.415626 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba59e59-e22a-490c-b63e-47cf86b29185-utilities\") pod \"community-operators-r9xjd\" (UID: \"cba59e59-e22a-490c-b63e-47cf86b29185\") " pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.415790 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba59e59-e22a-490c-b63e-47cf86b29185-catalog-content\") pod \"community-operators-r9xjd\" (UID: \"cba59e59-e22a-490c-b63e-47cf86b29185\") " pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.518407 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gbqn\" (UniqueName: \"kubernetes.io/projected/cba59e59-e22a-490c-b63e-47cf86b29185-kube-api-access-7gbqn\") pod \"community-operators-r9xjd\" (UID: \"cba59e59-e22a-490c-b63e-47cf86b29185\") " pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.518803 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba59e59-e22a-490c-b63e-47cf86b29185-utilities\") pod \"community-operators-r9xjd\" (UID: \"cba59e59-e22a-490c-b63e-47cf86b29185\") " pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.518867 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba59e59-e22a-490c-b63e-47cf86b29185-catalog-content\") pod \"community-operators-r9xjd\" (UID: \"cba59e59-e22a-490c-b63e-47cf86b29185\") " pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.519582 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba59e59-e22a-490c-b63e-47cf86b29185-catalog-content\") pod \"community-operators-r9xjd\" (UID: \"cba59e59-e22a-490c-b63e-47cf86b29185\") " pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.520217 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba59e59-e22a-490c-b63e-47cf86b29185-utilities\") pod \"community-operators-r9xjd\" (UID: \"cba59e59-e22a-490c-b63e-47cf86b29185\") " pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.547174 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7gbqn\" (UniqueName: \"kubernetes.io/projected/cba59e59-e22a-490c-b63e-47cf86b29185-kube-api-access-7gbqn\") pod \"community-operators-r9xjd\" (UID: \"cba59e59-e22a-490c-b63e-47cf86b29185\") " pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:10:57 crc kubenswrapper[4691]: I1202 08:10:57.839006 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:10:58 crc kubenswrapper[4691]: I1202 08:10:58.360257 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r9xjd"] Dec 02 08:10:59 crc kubenswrapper[4691]: I1202 08:10:59.016293 4691 generic.go:334] "Generic (PLEG): container finished" podID="cba59e59-e22a-490c-b63e-47cf86b29185" containerID="820f0789d0c8c3e21b3c469f048133d80fde3d4b89f373d1f97bdba7e6317776" exitCode=0 Dec 02 08:10:59 crc kubenswrapper[4691]: I1202 08:10:59.016491 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9xjd" event={"ID":"cba59e59-e22a-490c-b63e-47cf86b29185","Type":"ContainerDied","Data":"820f0789d0c8c3e21b3c469f048133d80fde3d4b89f373d1f97bdba7e6317776"} Dec 02 08:10:59 crc kubenswrapper[4691]: I1202 08:10:59.016634 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9xjd" event={"ID":"cba59e59-e22a-490c-b63e-47cf86b29185","Type":"ContainerStarted","Data":"072311f5ccb4e4a5c1d46142e236ffbff65ca0a1d94c6f98f10a6b39c795092c"} Dec 02 08:11:00 crc kubenswrapper[4691]: I1202 08:11:00.030856 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9xjd" event={"ID":"cba59e59-e22a-490c-b63e-47cf86b29185","Type":"ContainerStarted","Data":"d2ce081085f0dd3dd025e5e2d388c79c53aac6612b72cb2607cf5ab10a1ea497"} Dec 02 08:11:01 crc kubenswrapper[4691]: I1202 08:11:01.046524 4691 generic.go:334] "Generic (PLEG): container finished" podID="cba59e59-e22a-490c-b63e-47cf86b29185" containerID="d2ce081085f0dd3dd025e5e2d388c79c53aac6612b72cb2607cf5ab10a1ea497" exitCode=0 Dec 02 08:11:01 crc kubenswrapper[4691]: I1202 08:11:01.046607 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9xjd" event={"ID":"cba59e59-e22a-490c-b63e-47cf86b29185","Type":"ContainerDied","Data":"d2ce081085f0dd3dd025e5e2d388c79c53aac6612b72cb2607cf5ab10a1ea497"} Dec 02 08:11:02 crc kubenswrapper[4691]: I1202 08:11:02.060120 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9xjd" event={"ID":"cba59e59-e22a-490c-b63e-47cf86b29185","Type":"ContainerStarted","Data":"e91837f1c687c3b0c3cbe7f45212f7fcf43b1a2f8ddaebfab20e832e92599b61"} Dec 02 08:11:02 crc kubenswrapper[4691]: I1202 08:11:02.083713 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r9xjd" podStartSLOduration=2.6602439970000002 podStartE2EDuration="5.083689975s" podCreationTimestamp="2025-12-02 08:10:57 +0000 UTC" firstStartedPulling="2025-12-02 08:10:59.019446565 +0000 UTC m=+1506.803525427" lastFinishedPulling="2025-12-02 08:11:01.442892543 +0000 UTC m=+1509.226971405" observedRunningTime="2025-12-02 08:11:02.079242994 +0000 UTC m=+1509.863321866" watchObservedRunningTime="2025-12-02 08:11:02.083689975 +0000 UTC m=+1509.867768837" Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.461271 4691 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-pz2l2"] Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.464729 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.474983 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz2l2"] Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.501018 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs88k\" (UniqueName: \"kubernetes.io/projected/f259e109-20ec-4931-846e-785fe3fc4bb3-kube-api-access-gs88k\") pod \"redhat-marketplace-pz2l2\" (UID: \"f259e109-20ec-4931-846e-785fe3fc4bb3\") " pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.501125 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f259e109-20ec-4931-846e-785fe3fc4bb3-catalog-content\") pod \"redhat-marketplace-pz2l2\" (UID: \"f259e109-20ec-4931-846e-785fe3fc4bb3\") " pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.501181 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f259e109-20ec-4931-846e-785fe3fc4bb3-utilities\") pod \"redhat-marketplace-pz2l2\" (UID: \"f259e109-20ec-4931-846e-785fe3fc4bb3\") " pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.603085 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f259e109-20ec-4931-846e-785fe3fc4bb3-catalog-content\") pod \"redhat-marketplace-pz2l2\" (UID: \"f259e109-20ec-4931-846e-785fe3fc4bb3\") " pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.603170 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f259e109-20ec-4931-846e-785fe3fc4bb3-utilities\") pod \"redhat-marketplace-pz2l2\" (UID: \"f259e109-20ec-4931-846e-785fe3fc4bb3\") " pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.603298 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs88k\" (UniqueName: \"kubernetes.io/projected/f259e109-20ec-4931-846e-785fe3fc4bb3-kube-api-access-gs88k\") pod \"redhat-marketplace-pz2l2\" (UID: \"f259e109-20ec-4931-846e-785fe3fc4bb3\") " pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.603824 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f259e109-20ec-4931-846e-785fe3fc4bb3-utilities\") pod \"redhat-marketplace-pz2l2\" (UID: \"f259e109-20ec-4931-846e-785fe3fc4bb3\") " pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.604963 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f259e109-20ec-4931-846e-785fe3fc4bb3-catalog-content\") pod \"redhat-marketplace-pz2l2\" (UID: \"f259e109-20ec-4931-846e-785fe3fc4bb3\") 
" pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.633703 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs88k\" (UniqueName: \"kubernetes.io/projected/f259e109-20ec-4931-846e-785fe3fc4bb3-kube-api-access-gs88k\") pod \"redhat-marketplace-pz2l2\" (UID: \"f259e109-20ec-4931-846e-785fe3fc4bb3\") " pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:06 crc kubenswrapper[4691]: I1202 08:11:06.785582 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:07 crc kubenswrapper[4691]: I1202 08:11:07.290170 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz2l2"] Dec 02 08:11:07 crc kubenswrapper[4691]: I1202 08:11:07.840019 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:11:07 crc kubenswrapper[4691]: I1202 08:11:07.840402 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:11:07 crc kubenswrapper[4691]: I1202 08:11:07.895247 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:11:08 crc kubenswrapper[4691]: I1202 08:11:08.121404 4691 generic.go:334] "Generic (PLEG): container finished" podID="f259e109-20ec-4931-846e-785fe3fc4bb3" containerID="c431c47ef92d2375d9cd8d51772c9d3ee271e3df1aeab94aabe55f84f85737e8" exitCode=0 Dec 02 08:11:08 crc kubenswrapper[4691]: I1202 08:11:08.121510 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz2l2" event={"ID":"f259e109-20ec-4931-846e-785fe3fc4bb3","Type":"ContainerDied","Data":"c431c47ef92d2375d9cd8d51772c9d3ee271e3df1aeab94aabe55f84f85737e8"} Dec 02 08:11:08 crc kubenswrapper[4691]: I1202 08:11:08.121588 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz2l2" event={"ID":"f259e109-20ec-4931-846e-785fe3fc4bb3","Type":"ContainerStarted","Data":"864cf9b4a95f711ffc7a018bde7f2f38551ec65a26f165087c453f1ab67a072a"} Dec 02 08:11:08 crc kubenswrapper[4691]: I1202 08:11:08.176305 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:11:09 crc kubenswrapper[4691]: I1202 08:11:09.132304 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz2l2" event={"ID":"f259e109-20ec-4931-846e-785fe3fc4bb3","Type":"ContainerStarted","Data":"299b2f68b2d73a325875cf73c52dbbfaeeca9f576c51b6843998d4abb0e1d885"} Dec 02 08:11:10 crc kubenswrapper[4691]: I1202 08:11:10.144670 4691 generic.go:334] "Generic (PLEG): container finished" podID="f259e109-20ec-4931-846e-785fe3fc4bb3" containerID="299b2f68b2d73a325875cf73c52dbbfaeeca9f576c51b6843998d4abb0e1d885" exitCode=0 Dec 02 08:11:10 crc kubenswrapper[4691]: I1202 08:11:10.144738 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz2l2" event={"ID":"f259e109-20ec-4931-846e-785fe3fc4bb3","Type":"ContainerDied","Data":"299b2f68b2d73a325875cf73c52dbbfaeeca9f576c51b6843998d4abb0e1d885"} Dec 02 08:11:10 crc kubenswrapper[4691]: I1202 08:11:10.238507 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r9xjd"] Dec 
02 08:11:10 crc kubenswrapper[4691]: I1202 08:11:10.239129 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r9xjd" podUID="cba59e59-e22a-490c-b63e-47cf86b29185" containerName="registry-server" containerID="cri-o://e91837f1c687c3b0c3cbe7f45212f7fcf43b1a2f8ddaebfab20e832e92599b61" gracePeriod=2 Dec 02 08:11:10 crc kubenswrapper[4691]: E1202 08:11:10.823740 4691 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcba59e59_e22a_490c_b63e_47cf86b29185.slice/crio-conmon-e91837f1c687c3b0c3cbe7f45212f7fcf43b1a2f8ddaebfab20e832e92599b61.scope\": RecentStats: unable to find data in memory cache]" Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.168163 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz2l2" event={"ID":"f259e109-20ec-4931-846e-785fe3fc4bb3","Type":"ContainerStarted","Data":"79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887"} Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.173103 4691 generic.go:334] "Generic (PLEG): container finished" podID="cba59e59-e22a-490c-b63e-47cf86b29185" containerID="e91837f1c687c3b0c3cbe7f45212f7fcf43b1a2f8ddaebfab20e832e92599b61" exitCode=0 Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.173151 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9xjd" event={"ID":"cba59e59-e22a-490c-b63e-47cf86b29185","Type":"ContainerDied","Data":"e91837f1c687c3b0c3cbe7f45212f7fcf43b1a2f8ddaebfab20e832e92599b61"} Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.188137 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pz2l2" podStartSLOduration=2.482490883 podStartE2EDuration="5.188117234s" podCreationTimestamp="2025-12-02 08:11:06 +0000 UTC" firstStartedPulling="2025-12-02 08:11:08.123632018 +0000 UTC m=+1515.907710880" lastFinishedPulling="2025-12-02 08:11:10.829258369 +0000 UTC m=+1518.613337231" observedRunningTime="2025-12-02 08:11:11.183779316 +0000 UTC m=+1518.967858178" watchObservedRunningTime="2025-12-02 08:11:11.188117234 +0000 UTC m=+1518.972196096" Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.239541 4691 util.go:48] "No ready sandbox for pod can be found. 
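
The pod_startup_latency_tracker entry above for redhat-marketplace-pz2l2 can be checked by hand: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration is that minus the image-pull window (lastFinishedPulling minus firstStartedPulling). The timestamps below are copied from the entry; the subtraction rule is inferred from the numbers, which it reproduces exactly.

    package main

    import (
        "fmt"
        "time"
    )

    func mustParse(s string) time.Time {
        // time.Parse accepts fractional seconds even though the layout omits them.
        t, err := time.Parse("2006-01-02 15:04:05 -0700 MST", s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2025-12-02 08:11:06 +0000 UTC")
        firstPull := mustParse("2025-12-02 08:11:08.123632018 +0000 UTC")
        lastPull := mustParse("2025-12-02 08:11:10.829258369 +0000 UTC")
        running := mustParse("2025-12-02 08:11:11.188117234 +0000 UTC")

        e2e := running.Sub(created)
        slo := e2e - lastPull.Sub(firstPull)
        fmt.Println("podStartE2EDuration:", e2e) // 5.188117234s, as logged
        fmt.Println("podStartSLOduration:", slo) // 2.482490883s, as logged
    }

The same relation holds for the community-operators-r9xjd entry earlier: 5.083689975s minus the 2.423445978s pull window gives the logged 2.6602439970000002s.
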
Need to start a new one" pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.400674 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba59e59-e22a-490c-b63e-47cf86b29185-utilities\") pod \"cba59e59-e22a-490c-b63e-47cf86b29185\" (UID: \"cba59e59-e22a-490c-b63e-47cf86b29185\") " Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.400935 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gbqn\" (UniqueName: \"kubernetes.io/projected/cba59e59-e22a-490c-b63e-47cf86b29185-kube-api-access-7gbqn\") pod \"cba59e59-e22a-490c-b63e-47cf86b29185\" (UID: \"cba59e59-e22a-490c-b63e-47cf86b29185\") " Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.400974 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba59e59-e22a-490c-b63e-47cf86b29185-catalog-content\") pod \"cba59e59-e22a-490c-b63e-47cf86b29185\" (UID: \"cba59e59-e22a-490c-b63e-47cf86b29185\") " Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.402187 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba59e59-e22a-490c-b63e-47cf86b29185-utilities" (OuterVolumeSpecName: "utilities") pod "cba59e59-e22a-490c-b63e-47cf86b29185" (UID: "cba59e59-e22a-490c-b63e-47cf86b29185"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.406385 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba59e59-e22a-490c-b63e-47cf86b29185-kube-api-access-7gbqn" (OuterVolumeSpecName: "kube-api-access-7gbqn") pod "cba59e59-e22a-490c-b63e-47cf86b29185" (UID: "cba59e59-e22a-490c-b63e-47cf86b29185"). InnerVolumeSpecName "kube-api-access-7gbqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.463153 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba59e59-e22a-490c-b63e-47cf86b29185-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cba59e59-e22a-490c-b63e-47cf86b29185" (UID: "cba59e59-e22a-490c-b63e-47cf86b29185"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.503352 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cba59e59-e22a-490c-b63e-47cf86b29185-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.503397 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gbqn\" (UniqueName: \"kubernetes.io/projected/cba59e59-e22a-490c-b63e-47cf86b29185-kube-api-access-7gbqn\") on node \"crc\" DevicePath \"\"" Dec 02 08:11:11 crc kubenswrapper[4691]: I1202 08:11:11.503421 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cba59e59-e22a-490c-b63e-47cf86b29185-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:11:12 crc kubenswrapper[4691]: I1202 08:11:12.185012 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r9xjd" event={"ID":"cba59e59-e22a-490c-b63e-47cf86b29185","Type":"ContainerDied","Data":"072311f5ccb4e4a5c1d46142e236ffbff65ca0a1d94c6f98f10a6b39c795092c"} Dec 02 08:11:12 crc kubenswrapper[4691]: I1202 08:11:12.185390 4691 scope.go:117] "RemoveContainer" containerID="e91837f1c687c3b0c3cbe7f45212f7fcf43b1a2f8ddaebfab20e832e92599b61" Dec 02 08:11:12 crc kubenswrapper[4691]: I1202 08:11:12.185062 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r9xjd" Dec 02 08:11:12 crc kubenswrapper[4691]: I1202 08:11:12.210198 4691 scope.go:117] "RemoveContainer" containerID="d2ce081085f0dd3dd025e5e2d388c79c53aac6612b72cb2607cf5ab10a1ea497" Dec 02 08:11:12 crc kubenswrapper[4691]: I1202 08:11:12.229271 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r9xjd"] Dec 02 08:11:12 crc kubenswrapper[4691]: I1202 08:11:12.243845 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r9xjd"] Dec 02 08:11:12 crc kubenswrapper[4691]: I1202 08:11:12.265107 4691 scope.go:117] "RemoveContainer" containerID="820f0789d0c8c3e21b3c469f048133d80fde3d4b89f373d1f97bdba7e6317776" Dec 02 08:11:12 crc kubenswrapper[4691]: I1202 08:11:12.576508 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba59e59-e22a-490c-b63e-47cf86b29185" path="/var/lib/kubelet/pods/cba59e59-e22a-490c-b63e-47cf86b29185/volumes" Dec 02 08:11:16 crc kubenswrapper[4691]: I1202 08:11:16.786698 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:16 crc kubenswrapper[4691]: I1202 08:11:16.787319 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:16 crc kubenswrapper[4691]: I1202 08:11:16.844323 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:17 crc kubenswrapper[4691]: I1202 08:11:17.289442 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:17 crc kubenswrapper[4691]: I1202 08:11:17.345403 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz2l2"] Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.292313 4691 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-marketplace-pz2l2" podUID="f259e109-20ec-4931-846e-785fe3fc4bb3" containerName="registry-server" containerID="cri-o://79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887" gracePeriod=2 Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.495227 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2fdtt"] Dec 02 08:11:19 crc kubenswrapper[4691]: E1202 08:11:19.496379 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba59e59-e22a-490c-b63e-47cf86b29185" containerName="registry-server" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.496435 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba59e59-e22a-490c-b63e-47cf86b29185" containerName="registry-server" Dec 02 08:11:19 crc kubenswrapper[4691]: E1202 08:11:19.496459 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba59e59-e22a-490c-b63e-47cf86b29185" containerName="extract-content" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.496466 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba59e59-e22a-490c-b63e-47cf86b29185" containerName="extract-content" Dec 02 08:11:19 crc kubenswrapper[4691]: E1202 08:11:19.496568 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba59e59-e22a-490c-b63e-47cf86b29185" containerName="extract-utilities" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.496580 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba59e59-e22a-490c-b63e-47cf86b29185" containerName="extract-utilities" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.497301 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba59e59-e22a-490c-b63e-47cf86b29185" containerName="registry-server" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.500824 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.507563 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2fdtt"] Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.622146 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8trzq\" (UniqueName: \"kubernetes.io/projected/b16b1cf5-2216-4371-891f-385f78b00021-kube-api-access-8trzq\") pod \"redhat-operators-2fdtt\" (UID: \"b16b1cf5-2216-4371-891f-385f78b00021\") " pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.622263 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b16b1cf5-2216-4371-891f-385f78b00021-catalog-content\") pod \"redhat-operators-2fdtt\" (UID: \"b16b1cf5-2216-4371-891f-385f78b00021\") " pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.626738 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b16b1cf5-2216-4371-891f-385f78b00021-utilities\") pod \"redhat-operators-2fdtt\" (UID: \"b16b1cf5-2216-4371-891f-385f78b00021\") " pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.728436 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8trzq\" (UniqueName: \"kubernetes.io/projected/b16b1cf5-2216-4371-891f-385f78b00021-kube-api-access-8trzq\") pod \"redhat-operators-2fdtt\" (UID: \"b16b1cf5-2216-4371-891f-385f78b00021\") " pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.728776 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b16b1cf5-2216-4371-891f-385f78b00021-catalog-content\") pod \"redhat-operators-2fdtt\" (UID: \"b16b1cf5-2216-4371-891f-385f78b00021\") " pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.728997 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b16b1cf5-2216-4371-891f-385f78b00021-utilities\") pod \"redhat-operators-2fdtt\" (UID: \"b16b1cf5-2216-4371-891f-385f78b00021\") " pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.729617 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b16b1cf5-2216-4371-891f-385f78b00021-utilities\") pod \"redhat-operators-2fdtt\" (UID: \"b16b1cf5-2216-4371-891f-385f78b00021\") " pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.730260 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b16b1cf5-2216-4371-891f-385f78b00021-catalog-content\") pod \"redhat-operators-2fdtt\" (UID: \"b16b1cf5-2216-4371-891f-385f78b00021\") " pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.749611 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8trzq\" (UniqueName: \"kubernetes.io/projected/b16b1cf5-2216-4371-891f-385f78b00021-kube-api-access-8trzq\") pod \"redhat-operators-2fdtt\" (UID: \"b16b1cf5-2216-4371-891f-385f78b00021\") " pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:19 crc kubenswrapper[4691]: I1202 08:11:19.859041 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.013079 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.139275 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f259e109-20ec-4931-846e-785fe3fc4bb3-utilities\") pod \"f259e109-20ec-4931-846e-785fe3fc4bb3\" (UID: \"f259e109-20ec-4931-846e-785fe3fc4bb3\") " Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.139689 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs88k\" (UniqueName: \"kubernetes.io/projected/f259e109-20ec-4931-846e-785fe3fc4bb3-kube-api-access-gs88k\") pod \"f259e109-20ec-4931-846e-785fe3fc4bb3\" (UID: \"f259e109-20ec-4931-846e-785fe3fc4bb3\") " Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.139814 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f259e109-20ec-4931-846e-785fe3fc4bb3-catalog-content\") pod \"f259e109-20ec-4931-846e-785fe3fc4bb3\" (UID: \"f259e109-20ec-4931-846e-785fe3fc4bb3\") " Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.140387 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f259e109-20ec-4931-846e-785fe3fc4bb3-utilities" (OuterVolumeSpecName: "utilities") pod "f259e109-20ec-4931-846e-785fe3fc4bb3" (UID: "f259e109-20ec-4931-846e-785fe3fc4bb3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.144254 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f259e109-20ec-4931-846e-785fe3fc4bb3-kube-api-access-gs88k" (OuterVolumeSpecName: "kube-api-access-gs88k") pod "f259e109-20ec-4931-846e-785fe3fc4bb3" (UID: "f259e109-20ec-4931-846e-785fe3fc4bb3"). InnerVolumeSpecName "kube-api-access-gs88k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.160492 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f259e109-20ec-4931-846e-785fe3fc4bb3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f259e109-20ec-4931-846e-785fe3fc4bb3" (UID: "f259e109-20ec-4931-846e-785fe3fc4bb3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.242709 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f259e109-20ec-4931-846e-785fe3fc4bb3-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.242783 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs88k\" (UniqueName: \"kubernetes.io/projected/f259e109-20ec-4931-846e-785fe3fc4bb3-kube-api-access-gs88k\") on node \"crc\" DevicePath \"\"" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.242797 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f259e109-20ec-4931-846e-785fe3fc4bb3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.304742 4691 generic.go:334] "Generic (PLEG): container finished" podID="f259e109-20ec-4931-846e-785fe3fc4bb3" containerID="79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887" exitCode=0 Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.304842 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pz2l2" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.304828 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz2l2" event={"ID":"f259e109-20ec-4931-846e-785fe3fc4bb3","Type":"ContainerDied","Data":"79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887"} Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.305019 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pz2l2" event={"ID":"f259e109-20ec-4931-846e-785fe3fc4bb3","Type":"ContainerDied","Data":"864cf9b4a95f711ffc7a018bde7f2f38551ec65a26f165087c453f1ab67a072a"} Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.305067 4691 scope.go:117] "RemoveContainer" containerID="79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.344162 4691 scope.go:117] "RemoveContainer" containerID="299b2f68b2d73a325875cf73c52dbbfaeeca9f576c51b6843998d4abb0e1d885" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.347458 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz2l2"] Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.358577 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pz2l2"] Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.369007 4691 scope.go:117] "RemoveContainer" containerID="c431c47ef92d2375d9cd8d51772c9d3ee271e3df1aeab94aabe55f84f85737e8" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.377889 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2fdtt"] Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.397971 4691 scope.go:117] "RemoveContainer" containerID="79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887" Dec 02 08:11:20 crc kubenswrapper[4691]: E1202 08:11:20.398815 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887\": container with ID starting with 79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887 not 
found: ID does not exist" containerID="79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.398897 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887"} err="failed to get container status \"79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887\": rpc error: code = NotFound desc = could not find container \"79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887\": container with ID starting with 79205a1a5ba88dcc390e6af170168c010af4b8f20abf31607b0c58296a9a2887 not found: ID does not exist" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.398947 4691 scope.go:117] "RemoveContainer" containerID="299b2f68b2d73a325875cf73c52dbbfaeeca9f576c51b6843998d4abb0e1d885" Dec 02 08:11:20 crc kubenswrapper[4691]: E1202 08:11:20.400784 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"299b2f68b2d73a325875cf73c52dbbfaeeca9f576c51b6843998d4abb0e1d885\": container with ID starting with 299b2f68b2d73a325875cf73c52dbbfaeeca9f576c51b6843998d4abb0e1d885 not found: ID does not exist" containerID="299b2f68b2d73a325875cf73c52dbbfaeeca9f576c51b6843998d4abb0e1d885" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.400821 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"299b2f68b2d73a325875cf73c52dbbfaeeca9f576c51b6843998d4abb0e1d885"} err="failed to get container status \"299b2f68b2d73a325875cf73c52dbbfaeeca9f576c51b6843998d4abb0e1d885\": rpc error: code = NotFound desc = could not find container \"299b2f68b2d73a325875cf73c52dbbfaeeca9f576c51b6843998d4abb0e1d885\": container with ID starting with 299b2f68b2d73a325875cf73c52dbbfaeeca9f576c51b6843998d4abb0e1d885 not found: ID does not exist" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.400850 4691 scope.go:117] "RemoveContainer" containerID="c431c47ef92d2375d9cd8d51772c9d3ee271e3df1aeab94aabe55f84f85737e8" Dec 02 08:11:20 crc kubenswrapper[4691]: E1202 08:11:20.401178 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c431c47ef92d2375d9cd8d51772c9d3ee271e3df1aeab94aabe55f84f85737e8\": container with ID starting with c431c47ef92d2375d9cd8d51772c9d3ee271e3df1aeab94aabe55f84f85737e8 not found: ID does not exist" containerID="c431c47ef92d2375d9cd8d51772c9d3ee271e3df1aeab94aabe55f84f85737e8" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.401210 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c431c47ef92d2375d9cd8d51772c9d3ee271e3df1aeab94aabe55f84f85737e8"} err="failed to get container status \"c431c47ef92d2375d9cd8d51772c9d3ee271e3df1aeab94aabe55f84f85737e8\": rpc error: code = NotFound desc = could not find container \"c431c47ef92d2375d9cd8d51772c9d3ee271e3df1aeab94aabe55f84f85737e8\": container with ID starting with c431c47ef92d2375d9cd8d51772c9d3ee271e3df1aeab94aabe55f84f85737e8 not found: ID does not exist" Dec 02 08:11:20 crc kubenswrapper[4691]: I1202 08:11:20.574260 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f259e109-20ec-4931-846e-785fe3fc4bb3" path="/var/lib/kubelet/pods/f259e109-20ec-4931-846e-785fe3fc4bb3/volumes" Dec 02 08:11:21 crc kubenswrapper[4691]: I1202 08:11:21.317482 4691 generic.go:334] "Generic (PLEG): container finished" 
podID="b16b1cf5-2216-4371-891f-385f78b00021" containerID="294351211b8d3f90fee90d344ae43a0cda5bbda10c2f965e551e7a297d7654b2" exitCode=0 Dec 02 08:11:21 crc kubenswrapper[4691]: I1202 08:11:21.317535 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fdtt" event={"ID":"b16b1cf5-2216-4371-891f-385f78b00021","Type":"ContainerDied","Data":"294351211b8d3f90fee90d344ae43a0cda5bbda10c2f965e551e7a297d7654b2"} Dec 02 08:11:21 crc kubenswrapper[4691]: I1202 08:11:21.317564 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fdtt" event={"ID":"b16b1cf5-2216-4371-891f-385f78b00021","Type":"ContainerStarted","Data":"ac08e4587114b9d00d2f7924ce7419754d7fc10702f97bc10c3fa6947e8b7b15"} Dec 02 08:11:21 crc kubenswrapper[4691]: I1202 08:11:21.320158 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:11:22 crc kubenswrapper[4691]: I1202 08:11:22.359815 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fdtt" event={"ID":"b16b1cf5-2216-4371-891f-385f78b00021","Type":"ContainerStarted","Data":"0bbe6930498c2d96482555417bfbfb437c220e8bd5c17614e21c0240da741c03"} Dec 02 08:11:25 crc kubenswrapper[4691]: I1202 08:11:25.387080 4691 generic.go:334] "Generic (PLEG): container finished" podID="b16b1cf5-2216-4371-891f-385f78b00021" containerID="0bbe6930498c2d96482555417bfbfb437c220e8bd5c17614e21c0240da741c03" exitCode=0 Dec 02 08:11:25 crc kubenswrapper[4691]: I1202 08:11:25.387154 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fdtt" event={"ID":"b16b1cf5-2216-4371-891f-385f78b00021","Type":"ContainerDied","Data":"0bbe6930498c2d96482555417bfbfb437c220e8bd5c17614e21c0240da741c03"} Dec 02 08:11:27 crc kubenswrapper[4691]: I1202 08:11:27.411155 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fdtt" event={"ID":"b16b1cf5-2216-4371-891f-385f78b00021","Type":"ContainerStarted","Data":"6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d"} Dec 02 08:11:27 crc kubenswrapper[4691]: I1202 08:11:27.434929 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2fdtt" podStartSLOduration=3.32566438 podStartE2EDuration="8.434903504s" podCreationTimestamp="2025-12-02 08:11:19 +0000 UTC" firstStartedPulling="2025-12-02 08:11:21.319826149 +0000 UTC m=+1529.103905011" lastFinishedPulling="2025-12-02 08:11:26.429065273 +0000 UTC m=+1534.213144135" observedRunningTime="2025-12-02 08:11:27.427647293 +0000 UTC m=+1535.211726175" watchObservedRunningTime="2025-12-02 08:11:27.434903504 +0000 UTC m=+1535.218982366" Dec 02 08:11:28 crc kubenswrapper[4691]: I1202 08:11:28.647153 4691 scope.go:117] "RemoveContainer" containerID="c6495d7a8e820854aaed8fdebfe3bef2ad068886c18ffc5ab9fec81e65cdc7f9" Dec 02 08:11:28 crc kubenswrapper[4691]: I1202 08:11:28.668332 4691 scope.go:117] "RemoveContainer" containerID="8dd835aeec3a6b54f83f39a520ec4721c5cafac504197fd018ade1ecbfb91a73" Dec 02 08:11:29 crc kubenswrapper[4691]: I1202 08:11:29.859780 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:29 crc kubenswrapper[4691]: I1202 08:11:29.860131 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:30 crc 
kubenswrapper[4691]: I1202 08:11:30.925083 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2fdtt" podUID="b16b1cf5-2216-4371-891f-385f78b00021" containerName="registry-server" probeResult="failure" output=< Dec 02 08:11:30 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Dec 02 08:11:30 crc kubenswrapper[4691]: > Dec 02 08:11:39 crc kubenswrapper[4691]: I1202 08:11:39.911218 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:39 crc kubenswrapper[4691]: I1202 08:11:39.963955 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:40 crc kubenswrapper[4691]: I1202 08:11:40.150922 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2fdtt"] Dec 02 08:11:41 crc kubenswrapper[4691]: I1202 08:11:41.539423 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2fdtt" podUID="b16b1cf5-2216-4371-891f-385f78b00021" containerName="registry-server" containerID="cri-o://6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d" gracePeriod=2 Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.022557 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.188339 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b16b1cf5-2216-4371-891f-385f78b00021-utilities\") pod \"b16b1cf5-2216-4371-891f-385f78b00021\" (UID: \"b16b1cf5-2216-4371-891f-385f78b00021\") " Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.188453 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8trzq\" (UniqueName: \"kubernetes.io/projected/b16b1cf5-2216-4371-891f-385f78b00021-kube-api-access-8trzq\") pod \"b16b1cf5-2216-4371-891f-385f78b00021\" (UID: \"b16b1cf5-2216-4371-891f-385f78b00021\") " Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.188526 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b16b1cf5-2216-4371-891f-385f78b00021-catalog-content\") pod \"b16b1cf5-2216-4371-891f-385f78b00021\" (UID: \"b16b1cf5-2216-4371-891f-385f78b00021\") " Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.190647 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b16b1cf5-2216-4371-891f-385f78b00021-utilities" (OuterVolumeSpecName: "utilities") pod "b16b1cf5-2216-4371-891f-385f78b00021" (UID: "b16b1cf5-2216-4371-891f-385f78b00021"). InnerVolumeSpecName "utilities". 
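
The startup probe failure above ("timeout: failed to connect service \":50051\" within 1s") is a connectivity check against the registry-server's gRPC port, which simply wasn't listening yet; nine seconds later the probe reports "started". A minimal TCP-level stand-in is below; reducing what appears to be a gRPC health check to a plain dial is an assumption made for brevity.

    package main

    import (
        "fmt"
        "net"
        "time"
    )

    func main() {
        conn, err := net.DialTimeout("tcp", "127.0.0.1:50051", time.Second)
        if err != nil {
            fmt.Printf("timeout: failed to connect service %q within 1s (%v)\n", ":50051", err)
            return
        }
        conn.Close()
        fmt.Println("probe ok")
    }
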
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.192831 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b16b1cf5-2216-4371-891f-385f78b00021-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.195873 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b16b1cf5-2216-4371-891f-385f78b00021-kube-api-access-8trzq" (OuterVolumeSpecName: "kube-api-access-8trzq") pod "b16b1cf5-2216-4371-891f-385f78b00021" (UID: "b16b1cf5-2216-4371-891f-385f78b00021"). InnerVolumeSpecName "kube-api-access-8trzq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.295171 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8trzq\" (UniqueName: \"kubernetes.io/projected/b16b1cf5-2216-4371-891f-385f78b00021-kube-api-access-8trzq\") on node \"crc\" DevicePath \"\"" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.301627 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b16b1cf5-2216-4371-891f-385f78b00021-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b16b1cf5-2216-4371-891f-385f78b00021" (UID: "b16b1cf5-2216-4371-891f-385f78b00021"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.397470 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b16b1cf5-2216-4371-891f-385f78b00021-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.551869 4691 generic.go:334] "Generic (PLEG): container finished" podID="b16b1cf5-2216-4371-891f-385f78b00021" containerID="6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d" exitCode=0 Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.551900 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2fdtt" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.551927 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fdtt" event={"ID":"b16b1cf5-2216-4371-891f-385f78b00021","Type":"ContainerDied","Data":"6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d"} Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.553538 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2fdtt" event={"ID":"b16b1cf5-2216-4371-891f-385f78b00021","Type":"ContainerDied","Data":"ac08e4587114b9d00d2f7924ce7419754d7fc10702f97bc10c3fa6947e8b7b15"} Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.553562 4691 scope.go:117] "RemoveContainer" containerID="6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.588025 4691 scope.go:117] "RemoveContainer" containerID="0bbe6930498c2d96482555417bfbfb437c220e8bd5c17614e21c0240da741c03" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.605827 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2fdtt"] Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.631686 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2fdtt"] Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.646965 4691 scope.go:117] "RemoveContainer" containerID="294351211b8d3f90fee90d344ae43a0cda5bbda10c2f965e551e7a297d7654b2" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.697533 4691 scope.go:117] "RemoveContainer" containerID="6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d" Dec 02 08:11:42 crc kubenswrapper[4691]: E1202 08:11:42.699205 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d\": container with ID starting with 6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d not found: ID does not exist" containerID="6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.699324 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d"} err="failed to get container status \"6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d\": rpc error: code = NotFound desc = could not find container \"6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d\": container with ID starting with 6f2fbdb652c2a7dd5a9a90cb4b64f70b2a53e0144a1056de7e2d1405b351a33d not found: ID does not exist" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.699404 4691 scope.go:117] "RemoveContainer" containerID="0bbe6930498c2d96482555417bfbfb437c220e8bd5c17614e21c0240da741c03" Dec 02 08:11:42 crc kubenswrapper[4691]: E1202 08:11:42.702202 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bbe6930498c2d96482555417bfbfb437c220e8bd5c17614e21c0240da741c03\": container with ID starting with 0bbe6930498c2d96482555417bfbfb437c220e8bd5c17614e21c0240da741c03 not found: ID does not exist" containerID="0bbe6930498c2d96482555417bfbfb437c220e8bd5c17614e21c0240da741c03" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.702258 4691 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bbe6930498c2d96482555417bfbfb437c220e8bd5c17614e21c0240da741c03"} err="failed to get container status \"0bbe6930498c2d96482555417bfbfb437c220e8bd5c17614e21c0240da741c03\": rpc error: code = NotFound desc = could not find container \"0bbe6930498c2d96482555417bfbfb437c220e8bd5c17614e21c0240da741c03\": container with ID starting with 0bbe6930498c2d96482555417bfbfb437c220e8bd5c17614e21c0240da741c03 not found: ID does not exist" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.702294 4691 scope.go:117] "RemoveContainer" containerID="294351211b8d3f90fee90d344ae43a0cda5bbda10c2f965e551e7a297d7654b2" Dec 02 08:11:42 crc kubenswrapper[4691]: E1202 08:11:42.705859 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"294351211b8d3f90fee90d344ae43a0cda5bbda10c2f965e551e7a297d7654b2\": container with ID starting with 294351211b8d3f90fee90d344ae43a0cda5bbda10c2f965e551e7a297d7654b2 not found: ID does not exist" containerID="294351211b8d3f90fee90d344ae43a0cda5bbda10c2f965e551e7a297d7654b2" Dec 02 08:11:42 crc kubenswrapper[4691]: I1202 08:11:42.705984 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"294351211b8d3f90fee90d344ae43a0cda5bbda10c2f965e551e7a297d7654b2"} err="failed to get container status \"294351211b8d3f90fee90d344ae43a0cda5bbda10c2f965e551e7a297d7654b2\": rpc error: code = NotFound desc = could not find container \"294351211b8d3f90fee90d344ae43a0cda5bbda10c2f965e551e7a297d7654b2\": container with ID starting with 294351211b8d3f90fee90d344ae43a0cda5bbda10c2f965e551e7a297d7654b2 not found: ID does not exist" Dec 02 08:11:44 crc kubenswrapper[4691]: I1202 08:11:44.574824 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b16b1cf5-2216-4371-891f-385f78b00021" path="/var/lib/kubelet/pods/b16b1cf5-2216-4371-891f-385f78b00021/volumes"
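
The three RemoveContainer attempts above each fail with rpc code = NotFound because the containers were already gone by the time the deletor re-checked their status (the pod had just been torn down via SyncLoop REMOVE). The kubelet logs the error and moves on, which is the right behavior for a cleanup path: "not found" means the work is already done. A minimal sketch of that pattern, assuming a gRPC-backed CRI client; removeFunc and the shortened IDs below are illustrative stand-ins, not kubelet code:

// notfound_cleanup.go - a sketch (not kubelet's actual code) of the pattern
// visible above: during teardown, a "container not found" error from the
// runtime means the container is already deleted, so treat it as success.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeFunc is a hypothetical stand-in for a CRI RemoveContainer call; it
// returns a gRPC error such as codes.NotFound when the ID no longer exists.
type removeFunc func(containerID string) error

// cleanupContainers deletes each container, treating NotFound as already-done.
func cleanupContainers(remove removeFunc, ids []string) error {
	for _, id := range ids {
		err := remove(id)
		if err == nil {
			continue
		}
		if status.Code(err) == codes.NotFound {
			// Benign race: another path (here, the pod REMOVE) got there first.
			fmt.Printf("container %s already gone, skipping\n", id)
			continue
		}
		return fmt.Errorf("removing %s: %w", id, err)
	}
	return nil
}

func main() {
	// Simulate the runtime answering NotFound for already-removed containers
	// (IDs shortened from the log above, purely for readability).
	gone := map[string]bool{"6f2fbdb652c2": true, "0bbe6930498c": true}
	remove := func(id string) error {
		if gone[id] {
			return status.Error(codes.NotFound, "could not find container "+id)
		}
		return nil
	}
	if err := cleanupContainers(remove, []string{"6f2fbdb652c2", "0bbe6930498c"}); err != nil {
		fmt.Println("cleanup failed:", err)
	}
}
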
containerName="extract-content" Dec 02 08:12:34 crc kubenswrapper[4691]: I1202 08:12:34.837341 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f259e109-20ec-4931-846e-785fe3fc4bb3" containerName="extract-content" Dec 02 08:12:34 crc kubenswrapper[4691]: E1202 08:12:34.837365 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b16b1cf5-2216-4371-891f-385f78b00021" containerName="extract-content" Dec 02 08:12:34 crc kubenswrapper[4691]: I1202 08:12:34.837374 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b16b1cf5-2216-4371-891f-385f78b00021" containerName="extract-content" Dec 02 08:12:34 crc kubenswrapper[4691]: E1202 08:12:34.837403 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f259e109-20ec-4931-846e-785fe3fc4bb3" containerName="registry-server" Dec 02 08:12:34 crc kubenswrapper[4691]: I1202 08:12:34.837411 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f259e109-20ec-4931-846e-785fe3fc4bb3" containerName="registry-server" Dec 02 08:12:34 crc kubenswrapper[4691]: I1202 08:12:34.837635 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f259e109-20ec-4931-846e-785fe3fc4bb3" containerName="registry-server" Dec 02 08:12:34 crc kubenswrapper[4691]: I1202 08:12:34.837672 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="b16b1cf5-2216-4371-891f-385f78b00021" containerName="registry-server" Dec 02 08:12:34 crc kubenswrapper[4691]: I1202 08:12:34.839556 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:34 crc kubenswrapper[4691]: I1202 08:12:34.852275 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z5gbp"] Dec 02 08:12:34 crc kubenswrapper[4691]: I1202 08:12:34.980376 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cf786db-2a9d-4a3c-ba13-c4a752da039d-catalog-content\") pod \"certified-operators-z5gbp\" (UID: \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\") " pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:34 crc kubenswrapper[4691]: I1202 08:12:34.980452 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cf786db-2a9d-4a3c-ba13-c4a752da039d-utilities\") pod \"certified-operators-z5gbp\" (UID: \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\") " pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:34 crc kubenswrapper[4691]: I1202 08:12:34.980588 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj4m4\" (UniqueName: \"kubernetes.io/projected/1cf786db-2a9d-4a3c-ba13-c4a752da039d-kube-api-access-lj4m4\") pod \"certified-operators-z5gbp\" (UID: \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\") " pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:35 crc kubenswrapper[4691]: I1202 08:12:35.082637 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lj4m4\" (UniqueName: \"kubernetes.io/projected/1cf786db-2a9d-4a3c-ba13-c4a752da039d-kube-api-access-lj4m4\") pod \"certified-operators-z5gbp\" (UID: \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\") " pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:35 crc kubenswrapper[4691]: I1202 08:12:35.082750 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cf786db-2a9d-4a3c-ba13-c4a752da039d-catalog-content\") pod \"certified-operators-z5gbp\" (UID: \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\") " pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:35 crc kubenswrapper[4691]: I1202 08:12:35.082880 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cf786db-2a9d-4a3c-ba13-c4a752da039d-utilities\") pod \"certified-operators-z5gbp\" (UID: \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\") " pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:35 crc kubenswrapper[4691]: I1202 08:12:35.083399 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cf786db-2a9d-4a3c-ba13-c4a752da039d-utilities\") pod \"certified-operators-z5gbp\" (UID: \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\") " pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:35 crc kubenswrapper[4691]: I1202 08:12:35.083510 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cf786db-2a9d-4a3c-ba13-c4a752da039d-catalog-content\") pod \"certified-operators-z5gbp\" (UID: \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\") " pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:35 crc kubenswrapper[4691]: I1202 08:12:35.120050 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj4m4\" (UniqueName: \"kubernetes.io/projected/1cf786db-2a9d-4a3c-ba13-c4a752da039d-kube-api-access-lj4m4\") pod \"certified-operators-z5gbp\" (UID: \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\") " pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:35 crc kubenswrapper[4691]: I1202 08:12:35.169435 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:35 crc kubenswrapper[4691]: I1202 08:12:35.697938 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-z5gbp"] Dec 02 08:12:35 crc kubenswrapper[4691]: I1202 08:12:35.776116 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5gbp" event={"ID":"1cf786db-2a9d-4a3c-ba13-c4a752da039d","Type":"ContainerStarted","Data":"2604bb09bbaa912d18c69ebfef100cd3c4910c35186d333c89bd185618a0d290"} Dec 02 08:12:36 crc kubenswrapper[4691]: I1202 08:12:36.786991 4691 generic.go:334] "Generic (PLEG): container finished" podID="1cf786db-2a9d-4a3c-ba13-c4a752da039d" containerID="ad9520a6bb97f7c800e526303fe6d1ebe0c11ca57d1fac585be9bb37a2c67648" exitCode=0 Dec 02 08:12:36 crc kubenswrapper[4691]: I1202 08:12:36.787101 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5gbp" event={"ID":"1cf786db-2a9d-4a3c-ba13-c4a752da039d","Type":"ContainerDied","Data":"ad9520a6bb97f7c800e526303fe6d1ebe0c11ca57d1fac585be9bb37a2c67648"} Dec 02 08:12:37 crc kubenswrapper[4691]: I1202 08:12:37.800748 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5gbp" event={"ID":"1cf786db-2a9d-4a3c-ba13-c4a752da039d","Type":"ContainerStarted","Data":"3634f3c9651b18e0bde78b1b83413d059054bc31652f99518daa05030f6cafab"} Dec 02 08:12:38 crc kubenswrapper[4691]: I1202 08:12:38.811227 4691 generic.go:334] "Generic (PLEG): container finished" podID="1cf786db-2a9d-4a3c-ba13-c4a752da039d" containerID="3634f3c9651b18e0bde78b1b83413d059054bc31652f99518daa05030f6cafab" exitCode=0 Dec 02 08:12:38 crc kubenswrapper[4691]: I1202 08:12:38.811333 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5gbp" event={"ID":"1cf786db-2a9d-4a3c-ba13-c4a752da039d","Type":"ContainerDied","Data":"3634f3c9651b18e0bde78b1b83413d059054bc31652f99518daa05030f6cafab"} Dec 02 08:12:39 crc kubenswrapper[4691]: I1202 08:12:39.823064 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5gbp" event={"ID":"1cf786db-2a9d-4a3c-ba13-c4a752da039d","Type":"ContainerStarted","Data":"7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b"} Dec 02 08:12:39 crc kubenswrapper[4691]: I1202 08:12:39.844117 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-z5gbp" podStartSLOduration=3.376601216 podStartE2EDuration="5.844093801s" podCreationTimestamp="2025-12-02 08:12:34 +0000 UTC" firstStartedPulling="2025-12-02 08:12:36.788907547 +0000 UTC m=+1604.572986409" lastFinishedPulling="2025-12-02 08:12:39.256400132 +0000 UTC m=+1607.040478994" observedRunningTime="2025-12-02 08:12:39.840585154 +0000 UTC m=+1607.624664016" watchObservedRunningTime="2025-12-02 08:12:39.844093801 +0000 UTC m=+1607.628172653" Dec 02 08:12:45 crc kubenswrapper[4691]: I1202 08:12:45.170264 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:45 crc kubenswrapper[4691]: I1202 08:12:45.170746 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:45 crc kubenswrapper[4691]: I1202 08:12:45.215068 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:45 crc kubenswrapper[4691]: I1202 08:12:45.929046 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:45 crc kubenswrapper[4691]: I1202 08:12:45.979396 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z5gbp"] Dec 02 08:12:47 crc kubenswrapper[4691]: I1202 08:12:47.895727 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-z5gbp" podUID="1cf786db-2a9d-4a3c-ba13-c4a752da039d" containerName="registry-server" containerID="cri-o://7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b" gracePeriod=2 Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.401352 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.449614 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cf786db-2a9d-4a3c-ba13-c4a752da039d-catalog-content\") pod \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\" (UID: \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\") " Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.449707 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lj4m4\" (UniqueName: \"kubernetes.io/projected/1cf786db-2a9d-4a3c-ba13-c4a752da039d-kube-api-access-lj4m4\") pod \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\" (UID: \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\") " Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.449828 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cf786db-2a9d-4a3c-ba13-c4a752da039d-utilities\") pod \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\" (UID: \"1cf786db-2a9d-4a3c-ba13-c4a752da039d\") " Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.450895 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cf786db-2a9d-4a3c-ba13-c4a752da039d-utilities" (OuterVolumeSpecName: "utilities") pod "1cf786db-2a9d-4a3c-ba13-c4a752da039d" (UID: "1cf786db-2a9d-4a3c-ba13-c4a752da039d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.457035 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cf786db-2a9d-4a3c-ba13-c4a752da039d-kube-api-access-lj4m4" (OuterVolumeSpecName: "kube-api-access-lj4m4") pod "1cf786db-2a9d-4a3c-ba13-c4a752da039d" (UID: "1cf786db-2a9d-4a3c-ba13-c4a752da039d"). InnerVolumeSpecName "kube-api-access-lj4m4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.553153 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lj4m4\" (UniqueName: \"kubernetes.io/projected/1cf786db-2a9d-4a3c-ba13-c4a752da039d-kube-api-access-lj4m4\") on node \"crc\" DevicePath \"\"" Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.553187 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1cf786db-2a9d-4a3c-ba13-c4a752da039d-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.818447 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cf786db-2a9d-4a3c-ba13-c4a752da039d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1cf786db-2a9d-4a3c-ba13-c4a752da039d" (UID: "1cf786db-2a9d-4a3c-ba13-c4a752da039d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.859770 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1cf786db-2a9d-4a3c-ba13-c4a752da039d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.907203 4691 generic.go:334] "Generic (PLEG): container finished" podID="1cf786db-2a9d-4a3c-ba13-c4a752da039d" containerID="7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b" exitCode=0 Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.907260 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5gbp" event={"ID":"1cf786db-2a9d-4a3c-ba13-c4a752da039d","Type":"ContainerDied","Data":"7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b"} Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.907293 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-z5gbp" event={"ID":"1cf786db-2a9d-4a3c-ba13-c4a752da039d","Type":"ContainerDied","Data":"2604bb09bbaa912d18c69ebfef100cd3c4910c35186d333c89bd185618a0d290"} Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.907311 4691 scope.go:117] "RemoveContainer" containerID="7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b" Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.907459 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-z5gbp" Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.931612 4691 scope.go:117] "RemoveContainer" containerID="3634f3c9651b18e0bde78b1b83413d059054bc31652f99518daa05030f6cafab" Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.944437 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-z5gbp"] Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.959730 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-z5gbp"] Dec 02 08:12:48 crc kubenswrapper[4691]: I1202 08:12:48.968090 4691 scope.go:117] "RemoveContainer" containerID="ad9520a6bb97f7c800e526303fe6d1ebe0c11ca57d1fac585be9bb37a2c67648" Dec 02 08:12:49 crc kubenswrapper[4691]: I1202 08:12:49.015102 4691 scope.go:117] "RemoveContainer" containerID="7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b" Dec 02 08:12:49 crc kubenswrapper[4691]: E1202 08:12:49.015551 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b\": container with ID starting with 7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b not found: ID does not exist" containerID="7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b" Dec 02 08:12:49 crc kubenswrapper[4691]: I1202 08:12:49.015620 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b"} err="failed to get container status \"7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b\": rpc error: code = NotFound desc = could not find container \"7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b\": container with ID starting with 7fb4d6f0793879ed8896665803f059804a132186ec86b8c4fc1686727db3900b not found: ID does not exist" Dec 02 08:12:49 crc kubenswrapper[4691]: I1202 08:12:49.015657 4691 scope.go:117] "RemoveContainer" containerID="3634f3c9651b18e0bde78b1b83413d059054bc31652f99518daa05030f6cafab" Dec 02 08:12:49 crc kubenswrapper[4691]: E1202 08:12:49.015990 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3634f3c9651b18e0bde78b1b83413d059054bc31652f99518daa05030f6cafab\": container with ID starting with 3634f3c9651b18e0bde78b1b83413d059054bc31652f99518daa05030f6cafab not found: ID does not exist" containerID="3634f3c9651b18e0bde78b1b83413d059054bc31652f99518daa05030f6cafab" Dec 02 08:12:49 crc kubenswrapper[4691]: I1202 08:12:49.016020 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3634f3c9651b18e0bde78b1b83413d059054bc31652f99518daa05030f6cafab"} err="failed to get container status \"3634f3c9651b18e0bde78b1b83413d059054bc31652f99518daa05030f6cafab\": rpc error: code = NotFound desc = could not find container \"3634f3c9651b18e0bde78b1b83413d059054bc31652f99518daa05030f6cafab\": container with ID starting with 3634f3c9651b18e0bde78b1b83413d059054bc31652f99518daa05030f6cafab not found: ID does not exist" Dec 02 08:12:49 crc kubenswrapper[4691]: I1202 08:12:49.016040 4691 scope.go:117] "RemoveContainer" containerID="ad9520a6bb97f7c800e526303fe6d1ebe0c11ca57d1fac585be9bb37a2c67648" Dec 02 08:12:49 crc kubenswrapper[4691]: E1202 08:12:49.016269 4691 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ad9520a6bb97f7c800e526303fe6d1ebe0c11ca57d1fac585be9bb37a2c67648\": container with ID starting with ad9520a6bb97f7c800e526303fe6d1ebe0c11ca57d1fac585be9bb37a2c67648 not found: ID does not exist" containerID="ad9520a6bb97f7c800e526303fe6d1ebe0c11ca57d1fac585be9bb37a2c67648" Dec 02 08:12:49 crc kubenswrapper[4691]: I1202 08:12:49.016293 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad9520a6bb97f7c800e526303fe6d1ebe0c11ca57d1fac585be9bb37a2c67648"} err="failed to get container status \"ad9520a6bb97f7c800e526303fe6d1ebe0c11ca57d1fac585be9bb37a2c67648\": rpc error: code = NotFound desc = could not find container \"ad9520a6bb97f7c800e526303fe6d1ebe0c11ca57d1fac585be9bb37a2c67648\": container with ID starting with ad9520a6bb97f7c800e526303fe6d1ebe0c11ca57d1fac585be9bb37a2c67648 not found: ID does not exist" Dec 02 08:12:50 crc kubenswrapper[4691]: I1202 08:12:50.589880 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cf786db-2a9d-4a3c-ba13-c4a752da039d" path="/var/lib/kubelet/pods/1cf786db-2a9d-4a3c-ba13-c4a752da039d/volumes" Dec 02 08:12:51 crc kubenswrapper[4691]: I1202 08:12:51.898922 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:12:51 crc kubenswrapper[4691]: I1202 08:12:51.899322 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:13:18 crc kubenswrapper[4691]: I1202 08:13:18.182430 4691 generic.go:334] "Generic (PLEG): container finished" podID="6eabed67-587a-402c-8d6f-02163a229356" containerID="a6fdba709896d1a466248b6adb44a72ace62723b378d200aca9e84f3562b5799" exitCode=0 Dec 02 08:13:18 crc kubenswrapper[4691]: I1202 08:13:18.182544 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" event={"ID":"6eabed67-587a-402c-8d6f-02163a229356","Type":"ContainerDied","Data":"a6fdba709896d1a466248b6adb44a72ace62723b378d200aca9e84f3562b5799"} Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.611131 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.698173 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-ssh-key\") pod \"6eabed67-587a-402c-8d6f-02163a229356\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.698304 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-inventory\") pod \"6eabed67-587a-402c-8d6f-02163a229356\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.698383 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-bootstrap-combined-ca-bundle\") pod \"6eabed67-587a-402c-8d6f-02163a229356\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.698483 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hlbv5\" (UniqueName: \"kubernetes.io/projected/6eabed67-587a-402c-8d6f-02163a229356-kube-api-access-hlbv5\") pod \"6eabed67-587a-402c-8d6f-02163a229356\" (UID: \"6eabed67-587a-402c-8d6f-02163a229356\") " Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.703997 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eabed67-587a-402c-8d6f-02163a229356-kube-api-access-hlbv5" (OuterVolumeSpecName: "kube-api-access-hlbv5") pod "6eabed67-587a-402c-8d6f-02163a229356" (UID: "6eabed67-587a-402c-8d6f-02163a229356"). InnerVolumeSpecName "kube-api-access-hlbv5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.704470 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6eabed67-587a-402c-8d6f-02163a229356" (UID: "6eabed67-587a-402c-8d6f-02163a229356"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.728635 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6eabed67-587a-402c-8d6f-02163a229356" (UID: "6eabed67-587a-402c-8d6f-02163a229356"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.733542 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-inventory" (OuterVolumeSpecName: "inventory") pod "6eabed67-587a-402c-8d6f-02163a229356" (UID: "6eabed67-587a-402c-8d6f-02163a229356"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.800902 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.800956 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.800976 4691 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eabed67-587a-402c-8d6f-02163a229356-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:13:19 crc kubenswrapper[4691]: I1202 08:13:19.800995 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hlbv5\" (UniqueName: \"kubernetes.io/projected/6eabed67-587a-402c-8d6f-02163a229356-kube-api-access-hlbv5\") on node \"crc\" DevicePath \"\"" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.203826 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" event={"ID":"6eabed67-587a-402c-8d6f-02163a229356","Type":"ContainerDied","Data":"24012a4ba0f7b0e45f4f4d06b0f1be8a76eb2ee09faf31c489e4a9f954ab3332"} Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.203884 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24012a4ba0f7b0e45f4f4d06b0f1be8a76eb2ee09faf31c489e4a9f954ab3332" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.203887 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ws572" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.290045 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh"] Dec 02 08:13:20 crc kubenswrapper[4691]: E1202 08:13:20.290618 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf786db-2a9d-4a3c-ba13-c4a752da039d" containerName="extract-content" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.290644 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf786db-2a9d-4a3c-ba13-c4a752da039d" containerName="extract-content" Dec 02 08:13:20 crc kubenswrapper[4691]: E1202 08:13:20.290690 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eabed67-587a-402c-8d6f-02163a229356" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.290719 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eabed67-587a-402c-8d6f-02163a229356" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 08:13:20 crc kubenswrapper[4691]: E1202 08:13:20.290749 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf786db-2a9d-4a3c-ba13-c4a752da039d" containerName="registry-server" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.290775 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf786db-2a9d-4a3c-ba13-c4a752da039d" containerName="registry-server" Dec 02 08:13:20 crc kubenswrapper[4691]: E1202 08:13:20.290793 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cf786db-2a9d-4a3c-ba13-c4a752da039d" containerName="extract-utilities" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.290802 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cf786db-2a9d-4a3c-ba13-c4a752da039d" containerName="extract-utilities" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.291031 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cf786db-2a9d-4a3c-ba13-c4a752da039d" containerName="registry-server" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.291053 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eabed67-587a-402c-8d6f-02163a229356" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.292783 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.298596 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.298903 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.299746 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.300131 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.307195 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh"] Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.415806 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90715156-30f9-4dfc-9c78-374f0a07bb4c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh\" (UID: \"90715156-30f9-4dfc-9c78-374f0a07bb4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.415900 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29p4k\" (UniqueName: \"kubernetes.io/projected/90715156-30f9-4dfc-9c78-374f0a07bb4c-kube-api-access-29p4k\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh\" (UID: \"90715156-30f9-4dfc-9c78-374f0a07bb4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.416071 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90715156-30f9-4dfc-9c78-374f0a07bb4c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh\" (UID: \"90715156-30f9-4dfc-9c78-374f0a07bb4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.517992 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90715156-30f9-4dfc-9c78-374f0a07bb4c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh\" (UID: \"90715156-30f9-4dfc-9c78-374f0a07bb4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.518137 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90715156-30f9-4dfc-9c78-374f0a07bb4c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh\" (UID: \"90715156-30f9-4dfc-9c78-374f0a07bb4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.518175 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29p4k\" (UniqueName: \"kubernetes.io/projected/90715156-30f9-4dfc-9c78-374f0a07bb4c-kube-api-access-29p4k\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh\" (UID: \"90715156-30f9-4dfc-9c78-374f0a07bb4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.527830 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90715156-30f9-4dfc-9c78-374f0a07bb4c-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh\" (UID: \"90715156-30f9-4dfc-9c78-374f0a07bb4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.530282 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90715156-30f9-4dfc-9c78-374f0a07bb4c-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh\" (UID: \"90715156-30f9-4dfc-9c78-374f0a07bb4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.535380 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29p4k\" (UniqueName: \"kubernetes.io/projected/90715156-30f9-4dfc-9c78-374f0a07bb4c-kube-api-access-29p4k\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh\" (UID: \"90715156-30f9-4dfc-9c78-374f0a07bb4c\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:13:20 crc kubenswrapper[4691]: I1202 08:13:20.632624 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:13:21 crc kubenswrapper[4691]: I1202 08:13:21.148815 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh"] Dec 02 08:13:21 crc kubenswrapper[4691]: I1202 08:13:21.220513 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" event={"ID":"90715156-30f9-4dfc-9c78-374f0a07bb4c","Type":"ContainerStarted","Data":"32c45277cfc3d3412431fe1e265eb842bc6cd7997df063581cf630c2c3844afd"} Dec 02 08:13:21 crc kubenswrapper[4691]: I1202 08:13:21.898712 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:13:21 crc kubenswrapper[4691]: I1202 08:13:21.899001 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:13:22 crc kubenswrapper[4691]: I1202 08:13:22.246138 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" event={"ID":"90715156-30f9-4dfc-9c78-374f0a07bb4c","Type":"ContainerStarted","Data":"e9f2fcdc7cda62c12c6aaf5ff4ba665643d5e8ffa546c996b1831c9a01202935"} Dec 02 08:13:22 crc kubenswrapper[4691]: I1202 08:13:22.270037 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" 
podStartSLOduration=1.72894668 podStartE2EDuration="2.27001673s" podCreationTimestamp="2025-12-02 08:13:20 +0000 UTC" firstStartedPulling="2025-12-02 08:13:21.153106955 +0000 UTC m=+1648.937185817" lastFinishedPulling="2025-12-02 08:13:21.694177005 +0000 UTC m=+1649.478255867" observedRunningTime="2025-12-02 08:13:22.262160543 +0000 UTC m=+1650.046239405" watchObservedRunningTime="2025-12-02 08:13:22.27001673 +0000 UTC m=+1650.054095592" Dec 02 08:13:24 crc kubenswrapper[4691]: I1202 08:13:24.055664 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-zwd88"] Dec 02 08:13:24 crc kubenswrapper[4691]: I1202 08:13:24.069905 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-dec3-account-create-update-ncxgc"] Dec 02 08:13:24 crc kubenswrapper[4691]: I1202 08:13:24.081977 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-ff84-account-create-update-7z425"] Dec 02 08:13:24 crc kubenswrapper[4691]: I1202 08:13:24.091556 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-zwd88"] Dec 02 08:13:24 crc kubenswrapper[4691]: I1202 08:13:24.108775 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-ff84-account-create-update-7z425"] Dec 02 08:13:24 crc kubenswrapper[4691]: I1202 08:13:24.120753 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-dec3-account-create-update-ncxgc"] Dec 02 08:13:24 crc kubenswrapper[4691]: I1202 08:13:24.574534 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f033101-48ac-4bad-9b29-3547a7c6b721" path="/var/lib/kubelet/pods/3f033101-48ac-4bad-9b29-3547a7c6b721/volumes" Dec 02 08:13:24 crc kubenswrapper[4691]: I1202 08:13:24.575780 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4821e81-a183-48bc-b558-9e1b64ad63c7" path="/var/lib/kubelet/pods/d4821e81-a183-48bc-b558-9e1b64ad63c7/volumes" Dec 02 08:13:24 crc kubenswrapper[4691]: I1202 08:13:24.576491 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db56f732-e81f-48e9-8b33-cc147b6ffc99" path="/var/lib/kubelet/pods/db56f732-e81f-48e9-8b33-cc147b6ffc99/volumes" Dec 02 08:13:25 crc kubenswrapper[4691]: I1202 08:13:25.029378 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rzknp"] Dec 02 08:13:25 crc kubenswrapper[4691]: I1202 08:13:25.040641 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rzknp"] Dec 02 08:13:26 crc kubenswrapper[4691]: I1202 08:13:26.573282 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31da07a4-df34-4c7e-91a1-9847f83ed7fd" path="/var/lib/kubelet/pods/31da07a4-df34-4c7e-91a1-9847f83ed7fd/volumes" Dec 02 08:13:27 crc kubenswrapper[4691]: I1202 08:13:27.030634 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-b036-account-create-update-z7rxp"] Dec 02 08:13:27 crc kubenswrapper[4691]: I1202 08:13:27.040072 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-c2vdf"] Dec 02 08:13:27 crc kubenswrapper[4691]: I1202 08:13:27.050210 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-c2vdf"] Dec 02 08:13:27 crc kubenswrapper[4691]: I1202 08:13:27.059666 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-b036-account-create-update-z7rxp"] Dec 02 08:13:28 crc kubenswrapper[4691]: I1202 08:13:28.574310 4691 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="4640a2f1-1c73-4c41-b389-d956599766e4" path="/var/lib/kubelet/pods/4640a2f1-1c73-4c41-b389-d956599766e4/volumes" Dec 02 08:13:28 crc kubenswrapper[4691]: I1202 08:13:28.575224 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb2a273-6a0e-4572-bf19-509df8b07a5f" path="/var/lib/kubelet/pods/fdb2a273-6a0e-4572-bf19-509df8b07a5f/volumes" Dec 02 08:13:28 crc kubenswrapper[4691]: I1202 08:13:28.792082 4691 scope.go:117] "RemoveContainer" containerID="b6029f2730d172ed992afb2aa8e4f965ce51ddf5bb9f5d8a0cc5d7cd8ebe6ba3" Dec 02 08:13:28 crc kubenswrapper[4691]: I1202 08:13:28.830630 4691 scope.go:117] "RemoveContainer" containerID="e3ffb8fee44d8f92819e5d3d4c4aa81a51986252f008b7e59d26cdcdf348fa0c" Dec 02 08:13:28 crc kubenswrapper[4691]: I1202 08:13:28.885941 4691 scope.go:117] "RemoveContainer" containerID="f3535ec4678743b3963ae2dbe40c65668525394f80ae452be7078d0397d5da0e" Dec 02 08:13:28 crc kubenswrapper[4691]: I1202 08:13:28.909769 4691 scope.go:117] "RemoveContainer" containerID="a4c40e80369dea4bb5ab5b70a7ab69f581d13fb8e549edf9d330336147b2565c" Dec 02 08:13:28 crc kubenswrapper[4691]: I1202 08:13:28.956751 4691 scope.go:117] "RemoveContainer" containerID="d751872029579419eeea5baacbc41ef848de61e7a80259cc42dd7016c5e88e85" Dec 02 08:13:28 crc kubenswrapper[4691]: I1202 08:13:28.998604 4691 scope.go:117] "RemoveContainer" containerID="599e7369ee44d8258bea4c260c65084d2e146ec3f8647477681bdcb10fb3ae4c" Dec 02 08:13:29 crc kubenswrapper[4691]: I1202 08:13:29.042711 4691 scope.go:117] "RemoveContainer" containerID="ad941f9a593e9dcd53cc0715e80ad6cfe7fa78cb1f78f37efd8b047895ad3a28" Dec 02 08:13:29 crc kubenswrapper[4691]: I1202 08:13:29.081497 4691 scope.go:117] "RemoveContainer" containerID="948d22f00eea93e493851bc530194ce67e9036824eed4028b52384b4119ac0cb" Dec 02 08:13:51 crc kubenswrapper[4691]: I1202 08:13:51.898712 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:13:51 crc kubenswrapper[4691]: I1202 08:13:51.899328 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:13:51 crc kubenswrapper[4691]: I1202 08:13:51.899392 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 08:13:51 crc kubenswrapper[4691]: I1202 08:13:51.900384 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:13:51 crc kubenswrapper[4691]: I1202 08:13:51.900447 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" 
containerID="cri-o://33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" gracePeriod=600 Dec 02 08:13:52 crc kubenswrapper[4691]: E1202 08:13:52.031856 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:13:52 crc kubenswrapper[4691]: I1202 08:13:52.571948 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" exitCode=0 Dec 02 08:13:52 crc kubenswrapper[4691]: I1202 08:13:52.576357 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4"} Dec 02 08:13:52 crc kubenswrapper[4691]: I1202 08:13:52.576414 4691 scope.go:117] "RemoveContainer" containerID="35bf2b176e04ba95989431e7a1c5a8ad045d68a6d864e710ee0e03b73b56f536" Dec 02 08:13:52 crc kubenswrapper[4691]: I1202 08:13:52.577278 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:13:52 crc kubenswrapper[4691]: E1202 08:13:52.577591 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:14:05 crc kubenswrapper[4691]: I1202 08:14:05.562182 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:14:05 crc kubenswrapper[4691]: E1202 08:14:05.563018 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:14:09 crc kubenswrapper[4691]: I1202 08:14:09.047810 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-lwfnn"] Dec 02 08:14:09 crc kubenswrapper[4691]: I1202 08:14:09.058479 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-c46db"] Dec 02 08:14:09 crc kubenswrapper[4691]: I1202 08:14:09.072989 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-76a8-account-create-update-bmt2m"] Dec 02 08:14:09 crc kubenswrapper[4691]: I1202 08:14:09.082465 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-hqmqz"] Dec 02 08:14:09 crc kubenswrapper[4691]: I1202 08:14:09.090805 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-c46db"] Dec 02 08:14:09 crc 
kubenswrapper[4691]: I1202 08:14:09.099821 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-lwfnn"] Dec 02 08:14:09 crc kubenswrapper[4691]: I1202 08:14:09.108481 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-hqmqz"] Dec 02 08:14:09 crc kubenswrapper[4691]: I1202 08:14:09.116797 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-f40f-account-create-update-9lqsp"] Dec 02 08:14:09 crc kubenswrapper[4691]: I1202 08:14:09.124869 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b89c-account-create-update-5qdcl"] Dec 02 08:14:09 crc kubenswrapper[4691]: I1202 08:14:09.133272 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-76a8-account-create-update-bmt2m"] Dec 02 08:14:09 crc kubenswrapper[4691]: I1202 08:14:09.141167 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-f40f-account-create-update-9lqsp"] Dec 02 08:14:09 crc kubenswrapper[4691]: I1202 08:14:09.149119 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b89c-account-create-update-5qdcl"] Dec 02 08:14:10 crc kubenswrapper[4691]: I1202 08:14:10.574697 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c376c7e-d8b9-4f7d-943b-ed3300a60c2d" path="/var/lib/kubelet/pods/3c376c7e-d8b9-4f7d-943b-ed3300a60c2d/volumes" Dec 02 08:14:10 crc kubenswrapper[4691]: I1202 08:14:10.576020 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62ec2865-a83a-43c0-a786-e5a22b7a7008" path="/var/lib/kubelet/pods/62ec2865-a83a-43c0-a786-e5a22b7a7008/volumes" Dec 02 08:14:10 crc kubenswrapper[4691]: I1202 08:14:10.576833 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7425c462-167a-4f21-9eee-afb9b7b1767e" path="/var/lib/kubelet/pods/7425c462-167a-4f21-9eee-afb9b7b1767e/volumes" Dec 02 08:14:10 crc kubenswrapper[4691]: I1202 08:14:10.577698 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92913a64-59ad-47ca-ad56-a2ad01fbc281" path="/var/lib/kubelet/pods/92913a64-59ad-47ca-ad56-a2ad01fbc281/volumes" Dec 02 08:14:10 crc kubenswrapper[4691]: I1202 08:14:10.579193 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad884439-9cac-4888-902a-81c30359a9b9" path="/var/lib/kubelet/pods/ad884439-9cac-4888-902a-81c30359a9b9/volumes" Dec 02 08:14:10 crc kubenswrapper[4691]: I1202 08:14:10.580001 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b465b231-332d-4086-a3eb-dcf8f02278b8" path="/var/lib/kubelet/pods/b465b231-332d-4086-a3eb-dcf8f02278b8/volumes"
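
The kubelet_volumes.go records above are periodic housekeeping: once a deleted pod's volumes have been torn down, the kubelet removes the leftover /var/lib/kubelet/pods/<uid>/volumes directory. A sketch of that kind of sweep; this is not the kubelet's actual implementation, and the empty-directory test below merely stands in for its "all volumes unmounted" check:

// orphan_volumes.go - a sketch of the cleanup logged above: scan the pods
// directory, and for any pod UID no longer active whose volumes subdirectory
// is empty, remove that leftover directory.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func cleanupOrphans(podsDir string, active map[string]bool) error {
	entries, err := os.ReadDir(podsDir)
	if err != nil {
		return err
	}
	for _, e := range entries {
		uid := e.Name()
		if !e.IsDir() || active[uid] {
			continue
		}
		volumes := filepath.Join(podsDir, uid, "volumes")
		contents, err := os.ReadDir(volumes)
		if err != nil || len(contents) > 0 {
			continue // still mounted (or unreadable): leave it for a later pass
		}
		if err := os.RemoveAll(volumes); err != nil {
			return err
		}
		fmt.Printf("Cleaned up orphaned pod volumes dir podUID=%q path=%q\n", uid, volumes)
	}
	return nil
}

func main() {
	// Example against a scratch directory; on a node this would be
	// /var/lib/kubelet/pods with the active set taken from the pod manager.
	dir, _ := os.MkdirTemp("", "pods")
	defer os.RemoveAll(dir)
	os.MkdirAll(filepath.Join(dir, "c9e8994b", "volumes"), 0o755)
	if err := cleanupOrphans(dir, map[string]bool{}); err != nil {
		fmt.Println("cleanup error:", err)
	}
}
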
Dec 02 08:14:18 crc kubenswrapper[4691]: I1202 08:14:18.562617 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:14:18 crc kubenswrapper[4691]: E1202 08:14:18.563187 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:14:19 crc kubenswrapper[4691]: I1202 08:14:19.033623 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-nxz46"] Dec 02 08:14:19 crc kubenswrapper[4691]: I1202 08:14:19.044874 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-nxz46"] Dec 02 08:14:20 crc kubenswrapper[4691]: I1202 08:14:20.573168 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e8994b-8c3a-4086-babd-3e79a67aae9b" path="/var/lib/kubelet/pods/c9e8994b-8c3a-4086-babd-3e79a67aae9b/volumes" Dec 02 08:14:21 crc kubenswrapper[4691]: I1202 08:14:21.040131 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-nxcgj"] Dec 02 08:14:21 crc kubenswrapper[4691]: I1202 08:14:21.054323 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-nxcgj"] Dec 02 08:14:22 crc kubenswrapper[4691]: I1202 08:14:22.574128 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e433058-a34d-4156-9a25-07a573d1c4d2" path="/var/lib/kubelet/pods/4e433058-a34d-4156-9a25-07a573d1c4d2/volumes" Dec 02 08:14:29 crc kubenswrapper[4691]: I1202 08:14:29.235645 4691 scope.go:117] "RemoveContainer" containerID="4727e5fb0790a48545f8ffb77ea8c8ae35747dccd94f7c8422e2c2d82ab566b8" Dec 02 08:14:29 crc kubenswrapper[4691]: I1202 08:14:29.266300 4691 scope.go:117] "RemoveContainer" containerID="8a298c05f7b53b3e8cd6b0bbaafb5d9be090886f3e1a82502b4ab3ae6c9c34bb" Dec 02 08:14:29 crc kubenswrapper[4691]: I1202 08:14:29.318732 4691 scope.go:117] "RemoveContainer" containerID="db04fb8cda17884a359e715d4b327344ffa90fbc34a3fa6c8386d8f09bee61ea" Dec 02 08:14:29 crc kubenswrapper[4691]: I1202 08:14:29.356852 4691 scope.go:117] "RemoveContainer" containerID="f1b59aa075304131bb6bbf8aa041011b103f08d90ce36861d16bb40fd71cb070" Dec 02 08:14:29 crc kubenswrapper[4691]: I1202 08:14:29.402411 4691 scope.go:117] "RemoveContainer" containerID="77f128d5bae6769cd4c4a22a70e7c45068adcec9139a6fa01b26443bbacd4909" Dec 02 08:14:29 crc kubenswrapper[4691]: I1202 08:14:29.465492 4691 scope.go:117] "RemoveContainer" containerID="7d27aa6e1a6a0f5ae69df5a48c3d5b0eb21d285919a5bf79167f074d28837bac" Dec 02 08:14:29 crc kubenswrapper[4691]: I1202 08:14:29.493508 4691 scope.go:117] "RemoveContainer" containerID="2b217d80dc4558ad11c48b8693c2a8f77a7e90e6b4edc082aaa97eeef516ccd9" Dec 02 08:14:29 crc kubenswrapper[4691]: I1202 08:14:29.518720 4691 scope.go:117] "RemoveContainer" containerID="34a11708d751994a7640e1c70737b8672f0866a7d5df6ef924491cc82aa9d617" Dec 02 08:14:33 crc kubenswrapper[4691]: I1202 08:14:33.562259 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:14:33 crc kubenswrapper[4691]: E1202 08:14:33.563010 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
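
Every sync attempt for machine-config-daemon-mgbt6 now fails immediately with the same "back-off 5m0s" message (08:13:52, 08:14:05, 08:14:18, 08:14:33 above, continuing below): the restart delay has reached its cap, so the 10-30 second spacing between these records is the sync loop re-checking, not the backoff ladder itself. The 5m cap is visible in the message; the commonly documented kubelet ladder doubles from a 10s base, which this log does not itself show. A sketch of that ladder under those assumptions:

// backoff_ladder.go - prints a doubling restart-backoff ladder capped at 5m,
// matching the "back-off 5m0s" cap in the messages above. The 10s base is an
// assumption (the commonly documented kubelet default), not taken from here.
package main

import (
	"fmt"
	"time"
)

func main() {
	const max = 5 * time.Minute
	delay := 10 * time.Second
	for i := 1; ; i++ {
		fmt.Printf("restart %d: wait %v\n", i, delay)
		if delay >= max {
			fmt.Println("capped at", max, "for every later restart")
			break
		}
		delay *= 2
		if delay > max {
			delay = max
		}
	}
}
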
Dec 02 08:14:45 crc kubenswrapper[4691]: I1202 08:14:45.562364 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:14:45 crc kubenswrapper[4691]: E1202 08:14:45.563129 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:14:56 crc kubenswrapper[4691]: I1202 08:14:56.051396 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-4tvs4"] Dec 02 08:14:56 crc kubenswrapper[4691]: I1202 08:14:56.061100 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-4tvs4"] Dec 02 08:14:56 crc kubenswrapper[4691]: I1202 08:14:56.208606 4691 generic.go:334] "Generic (PLEG): container finished" podID="90715156-30f9-4dfc-9c78-374f0a07bb4c" containerID="e9f2fcdc7cda62c12c6aaf5ff4ba665643d5e8ffa546c996b1831c9a01202935" exitCode=0 Dec 02 08:14:56 crc kubenswrapper[4691]: I1202 08:14:56.208932 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" event={"ID":"90715156-30f9-4dfc-9c78-374f0a07bb4c","Type":"ContainerDied","Data":"e9f2fcdc7cda62c12c6aaf5ff4ba665643d5e8ffa546c996b1831c9a01202935"} Dec 02 08:14:56 crc kubenswrapper[4691]: I1202 08:14:56.573541 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7c46467-e60b-47fd-be7b-660d674b6504" path="/var/lib/kubelet/pods/d7c46467-e60b-47fd-be7b-660d674b6504/volumes" Dec 02 08:14:57 crc kubenswrapper[4691]: I1202 08:14:57.561736 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:14:57 crc kubenswrapper[4691]: E1202 08:14:57.562579 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:14:57 crc kubenswrapper[4691]: I1202 08:14:57.613170 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:14:57 crc kubenswrapper[4691]: I1202 08:14:57.632246 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90715156-30f9-4dfc-9c78-374f0a07bb4c-ssh-key\") pod \"90715156-30f9-4dfc-9c78-374f0a07bb4c\" (UID: \"90715156-30f9-4dfc-9c78-374f0a07bb4c\") " Dec 02 08:14:57 crc kubenswrapper[4691]: I1202 08:14:57.632499 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29p4k\" (UniqueName: \"kubernetes.io/projected/90715156-30f9-4dfc-9c78-374f0a07bb4c-kube-api-access-29p4k\") pod \"90715156-30f9-4dfc-9c78-374f0a07bb4c\" (UID: \"90715156-30f9-4dfc-9c78-374f0a07bb4c\") " Dec 02 08:14:57 crc kubenswrapper[4691]: I1202 08:14:57.632561 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90715156-30f9-4dfc-9c78-374f0a07bb4c-inventory\") pod \"90715156-30f9-4dfc-9c78-374f0a07bb4c\" (UID: \"90715156-30f9-4dfc-9c78-374f0a07bb4c\") " Dec 02 08:14:57 crc kubenswrapper[4691]: I1202 08:14:57.639961 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90715156-30f9-4dfc-9c78-374f0a07bb4c-kube-api-access-29p4k" (OuterVolumeSpecName: "kube-api-access-29p4k") pod "90715156-30f9-4dfc-9c78-374f0a07bb4c" (UID: "90715156-30f9-4dfc-9c78-374f0a07bb4c"). InnerVolumeSpecName "kube-api-access-29p4k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:14:57 crc kubenswrapper[4691]: I1202 08:14:57.671262 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90715156-30f9-4dfc-9c78-374f0a07bb4c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "90715156-30f9-4dfc-9c78-374f0a07bb4c" (UID: "90715156-30f9-4dfc-9c78-374f0a07bb4c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:14:57 crc kubenswrapper[4691]: I1202 08:14:57.671300 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90715156-30f9-4dfc-9c78-374f0a07bb4c-inventory" (OuterVolumeSpecName: "inventory") pod "90715156-30f9-4dfc-9c78-374f0a07bb4c" (UID: "90715156-30f9-4dfc-9c78-374f0a07bb4c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:14:57 crc kubenswrapper[4691]: I1202 08:14:57.735527 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/90715156-30f9-4dfc-9c78-374f0a07bb4c-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:14:57 crc kubenswrapper[4691]: I1202 08:14:57.736037 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/90715156-30f9-4dfc-9c78-374f0a07bb4c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:14:57 crc kubenswrapper[4691]: I1202 08:14:57.736053 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29p4k\" (UniqueName: \"kubernetes.io/projected/90715156-30f9-4dfc-9c78-374f0a07bb4c-kube-api-access-29p4k\") on node \"crc\" DevicePath \"\"" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.249051 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" event={"ID":"90715156-30f9-4dfc-9c78-374f0a07bb4c","Type":"ContainerDied","Data":"32c45277cfc3d3412431fe1e265eb842bc6cd7997df063581cf630c2c3844afd"} Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.249105 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32c45277cfc3d3412431fe1e265eb842bc6cd7997df063581cf630c2c3844afd" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.249192 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.318952 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq"] Dec 02 08:14:58 crc kubenswrapper[4691]: E1202 08:14:58.319479 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90715156-30f9-4dfc-9c78-374f0a07bb4c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.319500 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="90715156-30f9-4dfc-9c78-374f0a07bb4c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.319706 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="90715156-30f9-4dfc-9c78-374f0a07bb4c" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.320584 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.324443 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.324443 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.326538 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.328555 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq"] Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.332790 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.450519 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w5fs\" (UniqueName: \"kubernetes.io/projected/9c3bebb2-7f42-4553-83b6-7fafbb022c70-kube-api-access-9w5fs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq\" (UID: \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.450713 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c3bebb2-7f42-4553-83b6-7fafbb022c70-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq\" (UID: \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.450751 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c3bebb2-7f42-4553-83b6-7fafbb022c70-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq\" (UID: \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.553087 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c3bebb2-7f42-4553-83b6-7fafbb022c70-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq\" (UID: \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.553151 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c3bebb2-7f42-4553-83b6-7fafbb022c70-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq\" (UID: \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.553203 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w5fs\" (UniqueName: \"kubernetes.io/projected/9c3bebb2-7f42-4553-83b6-7fafbb022c70-kube-api-access-9w5fs\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq\" (UID: \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.556942 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c3bebb2-7f42-4553-83b6-7fafbb022c70-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq\" (UID: \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.558975 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c3bebb2-7f42-4553-83b6-7fafbb022c70-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq\" (UID: \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.569488 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w5fs\" (UniqueName: \"kubernetes.io/projected/9c3bebb2-7f42-4553-83b6-7fafbb022c70-kube-api-access-9w5fs\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq\" (UID: \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:14:58 crc kubenswrapper[4691]: I1202 08:14:58.645021 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:14:59 crc kubenswrapper[4691]: I1202 08:14:59.164386 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq"] Dec 02 08:14:59 crc kubenswrapper[4691]: I1202 08:14:59.258562 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" event={"ID":"9c3bebb2-7f42-4553-83b6-7fafbb022c70","Type":"ContainerStarted","Data":"03a426fed36b5364e8bc62cb8bf3efb1eb8c54a66379471f7b822bd36a46c113"} Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.134080 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs"] Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.135959 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.139142 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.139642 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.166094 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs"] Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.190640 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3a5baae-3ed8-4d6c-8b6c-81569413503e-config-volume\") pod \"collect-profiles-29411055-56vvs\" (UID: \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.190742 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcbv4\" (UniqueName: \"kubernetes.io/projected/e3a5baae-3ed8-4d6c-8b6c-81569413503e-kube-api-access-kcbv4\") pod \"collect-profiles-29411055-56vvs\" (UID: \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.190811 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3a5baae-3ed8-4d6c-8b6c-81569413503e-secret-volume\") pod \"collect-profiles-29411055-56vvs\" (UID: \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.269694 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" event={"ID":"9c3bebb2-7f42-4553-83b6-7fafbb022c70","Type":"ContainerStarted","Data":"23536c86e18ec519ff4dbd28c54adedfb4d39168bfe36103c0fac80ed57ab87c"} Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.294698 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3a5baae-3ed8-4d6c-8b6c-81569413503e-config-volume\") pod \"collect-profiles-29411055-56vvs\" (UID: \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.294879 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcbv4\" (UniqueName: \"kubernetes.io/projected/e3a5baae-3ed8-4d6c-8b6c-81569413503e-kube-api-access-kcbv4\") pod \"collect-profiles-29411055-56vvs\" (UID: \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.294910 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3a5baae-3ed8-4d6c-8b6c-81569413503e-secret-volume\") pod \"collect-profiles-29411055-56vvs\" (UID: 
\"e3a5baae-3ed8-4d6c-8b6c-81569413503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.296055 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3a5baae-3ed8-4d6c-8b6c-81569413503e-config-volume\") pod \"collect-profiles-29411055-56vvs\" (UID: \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.297618 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" podStartSLOduration=1.6505002260000001 podStartE2EDuration="2.297603746s" podCreationTimestamp="2025-12-02 08:14:58 +0000 UTC" firstStartedPulling="2025-12-02 08:14:59.172444041 +0000 UTC m=+1746.956522903" lastFinishedPulling="2025-12-02 08:14:59.819547571 +0000 UTC m=+1747.603626423" observedRunningTime="2025-12-02 08:15:00.293198385 +0000 UTC m=+1748.077277267" watchObservedRunningTime="2025-12-02 08:15:00.297603746 +0000 UTC m=+1748.081682608" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.300587 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3a5baae-3ed8-4d6c-8b6c-81569413503e-secret-volume\") pod \"collect-profiles-29411055-56vvs\" (UID: \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.313944 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcbv4\" (UniqueName: \"kubernetes.io/projected/e3a5baae-3ed8-4d6c-8b6c-81569413503e-kube-api-access-kcbv4\") pod \"collect-profiles-29411055-56vvs\" (UID: \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:00 crc kubenswrapper[4691]: I1202 08:15:00.555906 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:01 crc kubenswrapper[4691]: I1202 08:15:01.018459 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs"] Dec 02 08:15:01 crc kubenswrapper[4691]: W1202 08:15:01.023177 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3a5baae_3ed8_4d6c_8b6c_81569413503e.slice/crio-ede4a4fdcada547f3a33a996b184d67891ec1e0e4987d20009373afa8f8edfa1 WatchSource:0}: Error finding container ede4a4fdcada547f3a33a996b184d67891ec1e0e4987d20009373afa8f8edfa1: Status 404 returned error can't find the container with id ede4a4fdcada547f3a33a996b184d67891ec1e0e4987d20009373afa8f8edfa1 Dec 02 08:15:01 crc kubenswrapper[4691]: I1202 08:15:01.285904 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" event={"ID":"e3a5baae-3ed8-4d6c-8b6c-81569413503e","Type":"ContainerStarted","Data":"0299b0430c8051be7f1504692503ce23d299866cdfe54676e3c5ee0236b508c1"} Dec 02 08:15:01 crc kubenswrapper[4691]: I1202 08:15:01.285960 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" event={"ID":"e3a5baae-3ed8-4d6c-8b6c-81569413503e","Type":"ContainerStarted","Data":"ede4a4fdcada547f3a33a996b184d67891ec1e0e4987d20009373afa8f8edfa1"} Dec 02 08:15:01 crc kubenswrapper[4691]: I1202 08:15:01.300973 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" podStartSLOduration=1.300951004 podStartE2EDuration="1.300951004s" podCreationTimestamp="2025-12-02 08:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:15:01.298740468 +0000 UTC m=+1749.082819340" watchObservedRunningTime="2025-12-02 08:15:01.300951004 +0000 UTC m=+1749.085029866" Dec 02 08:15:02 crc kubenswrapper[4691]: I1202 08:15:02.296532 4691 generic.go:334] "Generic (PLEG): container finished" podID="e3a5baae-3ed8-4d6c-8b6c-81569413503e" containerID="0299b0430c8051be7f1504692503ce23d299866cdfe54676e3c5ee0236b508c1" exitCode=0 Dec 02 08:15:02 crc kubenswrapper[4691]: I1202 08:15:02.296654 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" event={"ID":"e3a5baae-3ed8-4d6c-8b6c-81569413503e","Type":"ContainerDied","Data":"0299b0430c8051be7f1504692503ce23d299866cdfe54676e3c5ee0236b508c1"} Dec 02 08:15:03 crc kubenswrapper[4691]: I1202 08:15:03.661039 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:03 crc kubenswrapper[4691]: I1202 08:15:03.766793 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3a5baae-3ed8-4d6c-8b6c-81569413503e-config-volume\") pod \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\" (UID: \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\") " Dec 02 08:15:03 crc kubenswrapper[4691]: I1202 08:15:03.767018 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3a5baae-3ed8-4d6c-8b6c-81569413503e-secret-volume\") pod \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\" (UID: \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\") " Dec 02 08:15:03 crc kubenswrapper[4691]: I1202 08:15:03.767172 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcbv4\" (UniqueName: \"kubernetes.io/projected/e3a5baae-3ed8-4d6c-8b6c-81569413503e-kube-api-access-kcbv4\") pod \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\" (UID: \"e3a5baae-3ed8-4d6c-8b6c-81569413503e\") " Dec 02 08:15:03 crc kubenswrapper[4691]: I1202 08:15:03.768790 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3a5baae-3ed8-4d6c-8b6c-81569413503e-config-volume" (OuterVolumeSpecName: "config-volume") pod "e3a5baae-3ed8-4d6c-8b6c-81569413503e" (UID: "e3a5baae-3ed8-4d6c-8b6c-81569413503e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:15:03 crc kubenswrapper[4691]: I1202 08:15:03.774300 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a5baae-3ed8-4d6c-8b6c-81569413503e-kube-api-access-kcbv4" (OuterVolumeSpecName: "kube-api-access-kcbv4") pod "e3a5baae-3ed8-4d6c-8b6c-81569413503e" (UID: "e3a5baae-3ed8-4d6c-8b6c-81569413503e"). InnerVolumeSpecName "kube-api-access-kcbv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:15:03 crc kubenswrapper[4691]: I1202 08:15:03.778409 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3a5baae-3ed8-4d6c-8b6c-81569413503e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e3a5baae-3ed8-4d6c-8b6c-81569413503e" (UID: "e3a5baae-3ed8-4d6c-8b6c-81569413503e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:15:03 crc kubenswrapper[4691]: I1202 08:15:03.870085 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e3a5baae-3ed8-4d6c-8b6c-81569413503e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:15:03 crc kubenswrapper[4691]: I1202 08:15:03.870489 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e3a5baae-3ed8-4d6c-8b6c-81569413503e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:15:03 crc kubenswrapper[4691]: I1202 08:15:03.870580 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcbv4\" (UniqueName: \"kubernetes.io/projected/e3a5baae-3ed8-4d6c-8b6c-81569413503e-kube-api-access-kcbv4\") on node \"crc\" DevicePath \"\"" Dec 02 08:15:04 crc kubenswrapper[4691]: I1202 08:15:04.029743 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2zwm7"] Dec 02 08:15:04 crc kubenswrapper[4691]: I1202 08:15:04.043384 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2zwm7"] Dec 02 08:15:04 crc kubenswrapper[4691]: I1202 08:15:04.323432 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" event={"ID":"e3a5baae-3ed8-4d6c-8b6c-81569413503e","Type":"ContainerDied","Data":"ede4a4fdcada547f3a33a996b184d67891ec1e0e4987d20009373afa8f8edfa1"} Dec 02 08:15:04 crc kubenswrapper[4691]: I1202 08:15:04.323510 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ede4a4fdcada547f3a33a996b184d67891ec1e0e4987d20009373afa8f8edfa1" Dec 02 08:15:04 crc kubenswrapper[4691]: I1202 08:15:04.323515 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs" Dec 02 08:15:04 crc kubenswrapper[4691]: I1202 08:15:04.577258 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="570d8c0d-670c-4132-85e4-e13633c3bcc2" path="/var/lib/kubelet/pods/570d8c0d-670c-4132-85e4-e13633c3bcc2/volumes" Dec 02 08:15:07 crc kubenswrapper[4691]: I1202 08:15:07.034503 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-svfbx"] Dec 02 08:15:07 crc kubenswrapper[4691]: I1202 08:15:07.043231 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-svfbx"] Dec 02 08:15:08 crc kubenswrapper[4691]: I1202 08:15:08.574969 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94701dd1-f34b-4bdc-bf74-f67799127cd5" path="/var/lib/kubelet/pods/94701dd1-f34b-4bdc-bf74-f67799127cd5/volumes" Dec 02 08:15:11 crc kubenswrapper[4691]: I1202 08:15:11.562079 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:15:11 crc kubenswrapper[4691]: E1202 08:15:11.562775 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:15:21 crc kubenswrapper[4691]: I1202 08:15:21.049536 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-7sh8f"] Dec 02 08:15:21 crc kubenswrapper[4691]: I1202 08:15:21.060254 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-6qz2r"] Dec 02 08:15:21 crc kubenswrapper[4691]: I1202 08:15:21.095834 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-6qz2r"] Dec 02 08:15:21 crc kubenswrapper[4691]: I1202 08:15:21.108510 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-7sh8f"] Dec 02 08:15:22 crc kubenswrapper[4691]: I1202 08:15:22.568891 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:15:22 crc kubenswrapper[4691]: E1202 08:15:22.569201 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:15:22 crc kubenswrapper[4691]: I1202 08:15:22.580539 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f939f3c-07b4-42b8-94d9-3dbd15c03287" path="/var/lib/kubelet/pods/7f939f3c-07b4-42b8-94d9-3dbd15c03287/volumes" Dec 02 08:15:22 crc kubenswrapper[4691]: I1202 08:15:22.581439 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a217e1fe-be30-4247-91f3-020aaa089689" path="/var/lib/kubelet/pods/a217e1fe-be30-4247-91f3-020aaa089689/volumes" Dec 02 08:15:29 crc kubenswrapper[4691]: I1202 08:15:29.702926 4691 scope.go:117] "RemoveContainer" 
containerID="50ab6b0f70d3b855a58ba47bf17de058c6cf69c41cb33c8e150774441b7cdded" Dec 02 08:15:29 crc kubenswrapper[4691]: I1202 08:15:29.749489 4691 scope.go:117] "RemoveContainer" containerID="2bcee9f6f940f70280315d05b9510f179ada01021af10d3e5e1665f8a838fe67" Dec 02 08:15:29 crc kubenswrapper[4691]: I1202 08:15:29.797177 4691 scope.go:117] "RemoveContainer" containerID="7f40968c6bdee9e8fb71c6ec0ba22bcb3f072ed076a6b75797013af2e2e45f94" Dec 02 08:15:29 crc kubenswrapper[4691]: I1202 08:15:29.842425 4691 scope.go:117] "RemoveContainer" containerID="3463b4007cb4ad8471129346ff206ae38b9fa2426f91d358ddde396ea6ce2149" Dec 02 08:15:29 crc kubenswrapper[4691]: I1202 08:15:29.898027 4691 scope.go:117] "RemoveContainer" containerID="225c4639c39f31f374a90598a5cf3035e03cc249f5b00247468d64d85a944dd3" Dec 02 08:15:37 crc kubenswrapper[4691]: I1202 08:15:37.561453 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:15:37 crc kubenswrapper[4691]: E1202 08:15:37.562479 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:15:49 crc kubenswrapper[4691]: I1202 08:15:49.561480 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:15:49 crc kubenswrapper[4691]: E1202 08:15:49.562364 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:16:00 crc kubenswrapper[4691]: I1202 08:16:00.562132 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:16:00 crc kubenswrapper[4691]: E1202 08:16:00.562871 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:16:07 crc kubenswrapper[4691]: I1202 08:16:07.054516 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-6ba3-account-create-update-qdwjw"] Dec 02 08:16:07 crc kubenswrapper[4691]: I1202 08:16:07.064216 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-zdmsm"] Dec 02 08:16:07 crc kubenswrapper[4691]: I1202 08:16:07.091031 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-4p5d6"] Dec 02 08:16:07 crc kubenswrapper[4691]: I1202 08:16:07.102174 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0427-account-create-update-kprdn"] Dec 02 08:16:07 crc kubenswrapper[4691]: I1202 08:16:07.111093 
4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-jtr9g"] Dec 02 08:16:07 crc kubenswrapper[4691]: I1202 08:16:07.124330 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-6ba3-account-create-update-qdwjw"] Dec 02 08:16:07 crc kubenswrapper[4691]: I1202 08:16:07.132602 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-jtr9g"] Dec 02 08:16:07 crc kubenswrapper[4691]: I1202 08:16:07.139901 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0427-account-create-update-kprdn"] Dec 02 08:16:07 crc kubenswrapper[4691]: I1202 08:16:07.146935 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-zdmsm"] Dec 02 08:16:07 crc kubenswrapper[4691]: I1202 08:16:07.154069 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-4p5d6"] Dec 02 08:16:08 crc kubenswrapper[4691]: I1202 08:16:08.044224 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-2741-account-create-update-v7c8q"] Dec 02 08:16:08 crc kubenswrapper[4691]: I1202 08:16:08.055317 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-2741-account-create-update-v7c8q"] Dec 02 08:16:08 crc kubenswrapper[4691]: I1202 08:16:08.575881 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0df042e0-5c89-43c2-aa13-6e894851bc4b" path="/var/lib/kubelet/pods/0df042e0-5c89-43c2-aa13-6e894851bc4b/volumes" Dec 02 08:16:08 crc kubenswrapper[4691]: I1202 08:16:08.576535 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa3343a-ce66-4872-9b9e-d011b842e4d1" path="/var/lib/kubelet/pods/1aa3343a-ce66-4872-9b9e-d011b842e4d1/volumes" Dec 02 08:16:08 crc kubenswrapper[4691]: I1202 08:16:08.577526 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37f44801-39b7-4ed2-b8b6-3b15e740e058" path="/var/lib/kubelet/pods/37f44801-39b7-4ed2-b8b6-3b15e740e058/volumes" Dec 02 08:16:08 crc kubenswrapper[4691]: I1202 08:16:08.578445 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="911d250b-4d53-4d08-aee0-a92de349d7f1" path="/var/lib/kubelet/pods/911d250b-4d53-4d08-aee0-a92de349d7f1/volumes" Dec 02 08:16:08 crc kubenswrapper[4691]: I1202 08:16:08.584113 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1" path="/var/lib/kubelet/pods/a5b0b24f-61f2-4abf-8dd5-6cecb20ac4c1/volumes" Dec 02 08:16:08 crc kubenswrapper[4691]: I1202 08:16:08.585554 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eeb15868-5efc-4bc1-a297-9c4517cd23ee" path="/var/lib/kubelet/pods/eeb15868-5efc-4bc1-a297-9c4517cd23ee/volumes" Dec 02 08:16:09 crc kubenswrapper[4691]: I1202 08:16:09.026054 4691 generic.go:334] "Generic (PLEG): container finished" podID="9c3bebb2-7f42-4553-83b6-7fafbb022c70" containerID="23536c86e18ec519ff4dbd28c54adedfb4d39168bfe36103c0fac80ed57ab87c" exitCode=0 Dec 02 08:16:09 crc kubenswrapper[4691]: I1202 08:16:09.026115 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" event={"ID":"9c3bebb2-7f42-4553-83b6-7fafbb022c70","Type":"ContainerDied","Data":"23536c86e18ec519ff4dbd28c54adedfb4d39168bfe36103c0fac80ed57ab87c"} Dec 02 08:16:10 crc kubenswrapper[4691]: I1202 08:16:10.440317 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:16:10 crc kubenswrapper[4691]: I1202 08:16:10.567750 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c3bebb2-7f42-4553-83b6-7fafbb022c70-ssh-key\") pod \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\" (UID: \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\") " Dec 02 08:16:10 crc kubenswrapper[4691]: I1202 08:16:10.567981 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9w5fs\" (UniqueName: \"kubernetes.io/projected/9c3bebb2-7f42-4553-83b6-7fafbb022c70-kube-api-access-9w5fs\") pod \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\" (UID: \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\") " Dec 02 08:16:10 crc kubenswrapper[4691]: I1202 08:16:10.568095 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c3bebb2-7f42-4553-83b6-7fafbb022c70-inventory\") pod \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\" (UID: \"9c3bebb2-7f42-4553-83b6-7fafbb022c70\") " Dec 02 08:16:10 crc kubenswrapper[4691]: I1202 08:16:10.579931 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3bebb2-7f42-4553-83b6-7fafbb022c70-kube-api-access-9w5fs" (OuterVolumeSpecName: "kube-api-access-9w5fs") pod "9c3bebb2-7f42-4553-83b6-7fafbb022c70" (UID: "9c3bebb2-7f42-4553-83b6-7fafbb022c70"). InnerVolumeSpecName "kube-api-access-9w5fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:16:10 crc kubenswrapper[4691]: I1202 08:16:10.610580 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c3bebb2-7f42-4553-83b6-7fafbb022c70-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9c3bebb2-7f42-4553-83b6-7fafbb022c70" (UID: "9c3bebb2-7f42-4553-83b6-7fafbb022c70"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:16:10 crc kubenswrapper[4691]: I1202 08:16:10.617437 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c3bebb2-7f42-4553-83b6-7fafbb022c70-inventory" (OuterVolumeSpecName: "inventory") pod "9c3bebb2-7f42-4553-83b6-7fafbb022c70" (UID: "9c3bebb2-7f42-4553-83b6-7fafbb022c70"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:16:10 crc kubenswrapper[4691]: I1202 08:16:10.679290 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9w5fs\" (UniqueName: \"kubernetes.io/projected/9c3bebb2-7f42-4553-83b6-7fafbb022c70-kube-api-access-9w5fs\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:10 crc kubenswrapper[4691]: I1202 08:16:10.679327 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c3bebb2-7f42-4553-83b6-7fafbb022c70-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:10 crc kubenswrapper[4691]: I1202 08:16:10.679337 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c3bebb2-7f42-4553-83b6-7fafbb022c70-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.044311 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" event={"ID":"9c3bebb2-7f42-4553-83b6-7fafbb022c70","Type":"ContainerDied","Data":"03a426fed36b5364e8bc62cb8bf3efb1eb8c54a66379471f7b822bd36a46c113"} Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.044355 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03a426fed36b5364e8bc62cb8bf3efb1eb8c54a66379471f7b822bd36a46c113" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.044408 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.149890 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6"] Dec 02 08:16:11 crc kubenswrapper[4691]: E1202 08:16:11.150452 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3bebb2-7f42-4553-83b6-7fafbb022c70" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.150475 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3bebb2-7f42-4553-83b6-7fafbb022c70" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 08:16:11 crc kubenswrapper[4691]: E1202 08:16:11.150504 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a5baae-3ed8-4d6c-8b6c-81569413503e" containerName="collect-profiles" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.150512 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a5baae-3ed8-4d6c-8b6c-81569413503e" containerName="collect-profiles" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.150749 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3bebb2-7f42-4553-83b6-7fafbb022c70" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.150795 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a5baae-3ed8-4d6c-8b6c-81569413503e" containerName="collect-profiles" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.151637 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.154542 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.154827 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.158972 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.160998 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.161068 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6"] Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.293259 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f7ee74e-e2c8-4144-9643-4df288709175-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-67hl6\" (UID: \"5f7ee74e-e2c8-4144-9643-4df288709175\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.293308 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f7ee74e-e2c8-4144-9643-4df288709175-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-67hl6\" (UID: \"5f7ee74e-e2c8-4144-9643-4df288709175\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.293674 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9lzr\" (UniqueName: \"kubernetes.io/projected/5f7ee74e-e2c8-4144-9643-4df288709175-kube-api-access-n9lzr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-67hl6\" (UID: \"5f7ee74e-e2c8-4144-9643-4df288709175\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.395323 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f7ee74e-e2c8-4144-9643-4df288709175-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-67hl6\" (UID: \"5f7ee74e-e2c8-4144-9643-4df288709175\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.395389 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f7ee74e-e2c8-4144-9643-4df288709175-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-67hl6\" (UID: \"5f7ee74e-e2c8-4144-9643-4df288709175\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.395515 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9lzr\" (UniqueName: \"kubernetes.io/projected/5f7ee74e-e2c8-4144-9643-4df288709175-kube-api-access-n9lzr\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-67hl6\" (UID: \"5f7ee74e-e2c8-4144-9643-4df288709175\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.400781 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f7ee74e-e2c8-4144-9643-4df288709175-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-67hl6\" (UID: \"5f7ee74e-e2c8-4144-9643-4df288709175\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.404278 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f7ee74e-e2c8-4144-9643-4df288709175-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-67hl6\" (UID: \"5f7ee74e-e2c8-4144-9643-4df288709175\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.415149 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9lzr\" (UniqueName: \"kubernetes.io/projected/5f7ee74e-e2c8-4144-9643-4df288709175-kube-api-access-n9lzr\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-67hl6\" (UID: \"5f7ee74e-e2c8-4144-9643-4df288709175\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:11 crc kubenswrapper[4691]: I1202 08:16:11.481936 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:12 crc kubenswrapper[4691]: I1202 08:16:12.169349 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6"] Dec 02 08:16:13 crc kubenswrapper[4691]: I1202 08:16:13.064272 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" event={"ID":"5f7ee74e-e2c8-4144-9643-4df288709175","Type":"ContainerStarted","Data":"7c0599c164aa5c285c4ebcac4308560429372af1999af77c618ac59065f37441"} Dec 02 08:16:14 crc kubenswrapper[4691]: I1202 08:16:14.074254 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" event={"ID":"5f7ee74e-e2c8-4144-9643-4df288709175","Type":"ContainerStarted","Data":"599912aca3c1aa91cf5c7e323b11b294698b610e941a77019f3ef37fec8ac9ad"} Dec 02 08:16:14 crc kubenswrapper[4691]: I1202 08:16:14.100057 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" podStartSLOduration=2.0959271250000002 podStartE2EDuration="3.100029752s" podCreationTimestamp="2025-12-02 08:16:11 +0000 UTC" firstStartedPulling="2025-12-02 08:16:12.169774371 +0000 UTC m=+1819.953853233" lastFinishedPulling="2025-12-02 08:16:13.173876998 +0000 UTC m=+1820.957955860" observedRunningTime="2025-12-02 08:16:14.091358174 +0000 UTC m=+1821.875437036" watchObservedRunningTime="2025-12-02 08:16:14.100029752 +0000 UTC m=+1821.884108614" Dec 02 08:16:14 crc kubenswrapper[4691]: I1202 08:16:14.562231 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:16:14 crc kubenswrapper[4691]: E1202 08:16:14.562458 4691 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:16:19 crc kubenswrapper[4691]: I1202 08:16:19.118712 4691 generic.go:334] "Generic (PLEG): container finished" podID="5f7ee74e-e2c8-4144-9643-4df288709175" containerID="599912aca3c1aa91cf5c7e323b11b294698b610e941a77019f3ef37fec8ac9ad" exitCode=0 Dec 02 08:16:19 crc kubenswrapper[4691]: I1202 08:16:19.118788 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" event={"ID":"5f7ee74e-e2c8-4144-9643-4df288709175","Type":"ContainerDied","Data":"599912aca3c1aa91cf5c7e323b11b294698b610e941a77019f3ef37fec8ac9ad"} Dec 02 08:16:20 crc kubenswrapper[4691]: I1202 08:16:20.834840 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:20 crc kubenswrapper[4691]: I1202 08:16:20.959088 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f7ee74e-e2c8-4144-9643-4df288709175-ssh-key\") pod \"5f7ee74e-e2c8-4144-9643-4df288709175\" (UID: \"5f7ee74e-e2c8-4144-9643-4df288709175\") " Dec 02 08:16:20 crc kubenswrapper[4691]: I1202 08:16:20.959160 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9lzr\" (UniqueName: \"kubernetes.io/projected/5f7ee74e-e2c8-4144-9643-4df288709175-kube-api-access-n9lzr\") pod \"5f7ee74e-e2c8-4144-9643-4df288709175\" (UID: \"5f7ee74e-e2c8-4144-9643-4df288709175\") " Dec 02 08:16:20 crc kubenswrapper[4691]: I1202 08:16:20.959275 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f7ee74e-e2c8-4144-9643-4df288709175-inventory\") pod \"5f7ee74e-e2c8-4144-9643-4df288709175\" (UID: \"5f7ee74e-e2c8-4144-9643-4df288709175\") " Dec 02 08:16:20 crc kubenswrapper[4691]: I1202 08:16:20.965286 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f7ee74e-e2c8-4144-9643-4df288709175-kube-api-access-n9lzr" (OuterVolumeSpecName: "kube-api-access-n9lzr") pod "5f7ee74e-e2c8-4144-9643-4df288709175" (UID: "5f7ee74e-e2c8-4144-9643-4df288709175"). InnerVolumeSpecName "kube-api-access-n9lzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:16:20 crc kubenswrapper[4691]: I1202 08:16:20.990295 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f7ee74e-e2c8-4144-9643-4df288709175-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5f7ee74e-e2c8-4144-9643-4df288709175" (UID: "5f7ee74e-e2c8-4144-9643-4df288709175"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:16:20 crc kubenswrapper[4691]: I1202 08:16:20.992973 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f7ee74e-e2c8-4144-9643-4df288709175-inventory" (OuterVolumeSpecName: "inventory") pod "5f7ee74e-e2c8-4144-9643-4df288709175" (UID: "5f7ee74e-e2c8-4144-9643-4df288709175"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.062593 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5f7ee74e-e2c8-4144-9643-4df288709175-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.062640 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9lzr\" (UniqueName: \"kubernetes.io/projected/5f7ee74e-e2c8-4144-9643-4df288709175-kube-api-access-n9lzr\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.062652 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f7ee74e-e2c8-4144-9643-4df288709175-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.136349 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" event={"ID":"5f7ee74e-e2c8-4144-9643-4df288709175","Type":"ContainerDied","Data":"7c0599c164aa5c285c4ebcac4308560429372af1999af77c618ac59065f37441"} Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.136405 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-67hl6" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.136410 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c0599c164aa5c285c4ebcac4308560429372af1999af77c618ac59065f37441" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.208432 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx"] Dec 02 08:16:21 crc kubenswrapper[4691]: E1202 08:16:21.208980 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f7ee74e-e2c8-4144-9643-4df288709175" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.209006 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f7ee74e-e2c8-4144-9643-4df288709175" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.209277 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f7ee74e-e2c8-4144-9643-4df288709175" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.210342 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.212738 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.213192 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.213513 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.215268 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.219839 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx"] Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.267496 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf7g7\" (UniqueName: \"kubernetes.io/projected/c4804dc1-5ac2-422e-87fe-71120becde69-kube-api-access-xf7g7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lx2xx\" (UID: \"c4804dc1-5ac2-422e-87fe-71120becde69\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.267550 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4804dc1-5ac2-422e-87fe-71120becde69-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lx2xx\" (UID: \"c4804dc1-5ac2-422e-87fe-71120becde69\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.267705 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4804dc1-5ac2-422e-87fe-71120becde69-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lx2xx\" (UID: \"c4804dc1-5ac2-422e-87fe-71120becde69\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.370339 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4804dc1-5ac2-422e-87fe-71120becde69-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lx2xx\" (UID: \"c4804dc1-5ac2-422e-87fe-71120becde69\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.370506 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf7g7\" (UniqueName: \"kubernetes.io/projected/c4804dc1-5ac2-422e-87fe-71120becde69-kube-api-access-xf7g7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lx2xx\" (UID: \"c4804dc1-5ac2-422e-87fe-71120becde69\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.370559 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4804dc1-5ac2-422e-87fe-71120becde69-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lx2xx\" (UID: 
\"c4804dc1-5ac2-422e-87fe-71120becde69\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.375138 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4804dc1-5ac2-422e-87fe-71120becde69-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lx2xx\" (UID: \"c4804dc1-5ac2-422e-87fe-71120becde69\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.375506 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4804dc1-5ac2-422e-87fe-71120becde69-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lx2xx\" (UID: \"c4804dc1-5ac2-422e-87fe-71120becde69\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.387043 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf7g7\" (UniqueName: \"kubernetes.io/projected/c4804dc1-5ac2-422e-87fe-71120becde69-kube-api-access-xf7g7\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-lx2xx\" (UID: \"c4804dc1-5ac2-422e-87fe-71120becde69\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:16:21 crc kubenswrapper[4691]: I1202 08:16:21.531295 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:16:22 crc kubenswrapper[4691]: I1202 08:16:22.069373 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx"] Dec 02 08:16:22 crc kubenswrapper[4691]: W1202 08:16:22.070097 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4804dc1_5ac2_422e_87fe_71120becde69.slice/crio-32b1e6c60fab689b0bb60c1671c5342944e1f384b9a786ac7bf43bae81b6a7af WatchSource:0}: Error finding container 32b1e6c60fab689b0bb60c1671c5342944e1f384b9a786ac7bf43bae81b6a7af: Status 404 returned error can't find the container with id 32b1e6c60fab689b0bb60c1671c5342944e1f384b9a786ac7bf43bae81b6a7af Dec 02 08:16:22 crc kubenswrapper[4691]: I1202 08:16:22.072404 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:16:22 crc kubenswrapper[4691]: I1202 08:16:22.145915 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" event={"ID":"c4804dc1-5ac2-422e-87fe-71120becde69","Type":"ContainerStarted","Data":"32b1e6c60fab689b0bb60c1671c5342944e1f384b9a786ac7bf43bae81b6a7af"} Dec 02 08:16:24 crc kubenswrapper[4691]: I1202 08:16:24.283403 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" event={"ID":"c4804dc1-5ac2-422e-87fe-71120becde69","Type":"ContainerStarted","Data":"3f2624b84bbc70d54404b7842e587dc17ab4667f43deea61ea037bb28ad92596"} Dec 02 08:16:24 crc kubenswrapper[4691]: I1202 08:16:24.307189 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" podStartSLOduration=2.623960123 podStartE2EDuration="3.307172932s" podCreationTimestamp="2025-12-02 08:16:21 +0000 UTC" firstStartedPulling="2025-12-02 08:16:22.07221199 +0000 UTC 
m=+1829.856290852" lastFinishedPulling="2025-12-02 08:16:22.755424799 +0000 UTC m=+1830.539503661" observedRunningTime="2025-12-02 08:16:24.30072673 +0000 UTC m=+1832.084805592" watchObservedRunningTime="2025-12-02 08:16:24.307172932 +0000 UTC m=+1832.091251784" Dec 02 08:16:27 crc kubenswrapper[4691]: I1202 08:16:27.561806 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:16:27 crc kubenswrapper[4691]: E1202 08:16:27.806343 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:16:30 crc kubenswrapper[4691]: I1202 08:16:30.018831 4691 scope.go:117] "RemoveContainer" containerID="6f25090cf32215af12e79085686c46515d96ebaeb0f3fb194a41b702c640f956" Dec 02 08:16:30 crc kubenswrapper[4691]: I1202 08:16:30.401540 4691 scope.go:117] "RemoveContainer" containerID="7d68c532baf5e3f8b3001b2becb50642186c0dace1229a7271a41b7365150086" Dec 02 08:16:30 crc kubenswrapper[4691]: I1202 08:16:30.441677 4691 scope.go:117] "RemoveContainer" containerID="3cadf93969314b116f62c846bac718e0bd2b10279e5621f5912ce1e1e7adbe52" Dec 02 08:16:30 crc kubenswrapper[4691]: I1202 08:16:30.486252 4691 scope.go:117] "RemoveContainer" containerID="695c054ef2acea72b1510c0324ce6accc69b94c49396976ff578ed978168ef76" Dec 02 08:16:30 crc kubenswrapper[4691]: I1202 08:16:30.525629 4691 scope.go:117] "RemoveContainer" containerID="84036d7dbc0f957d3fd002f3566d5ad1390b5e8436bc3db79e4d600dec2c129a" Dec 02 08:16:30 crc kubenswrapper[4691]: I1202 08:16:30.569589 4691 scope.go:117] "RemoveContainer" containerID="eed906381f0e4b018d0fe3d4c29052fc2c4ef4d0983999c691e997fd7798372c" Dec 02 08:16:38 crc kubenswrapper[4691]: I1202 08:16:38.047367 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p8dbg"] Dec 02 08:16:38 crc kubenswrapper[4691]: I1202 08:16:38.057640 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-p8dbg"] Dec 02 08:16:38 crc kubenswrapper[4691]: I1202 08:16:38.573092 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc9085a-c030-4fd8-bb83-ad19b91315ba" path="/var/lib/kubelet/pods/ecc9085a-c030-4fd8-bb83-ad19b91315ba/volumes" Dec 02 08:16:42 crc kubenswrapper[4691]: I1202 08:16:42.574248 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:16:42 crc kubenswrapper[4691]: E1202 08:16:42.574963 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:16:57 crc kubenswrapper[4691]: I1202 08:16:57.562292 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:16:57 crc kubenswrapper[4691]: E1202 08:16:57.563064 4691 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:17:02 crc kubenswrapper[4691]: I1202 08:17:02.038496 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-bdd2r"] Dec 02 08:17:02 crc kubenswrapper[4691]: I1202 08:17:02.051504 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-bdd2r"] Dec 02 08:17:02 crc kubenswrapper[4691]: I1202 08:17:02.571724 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e610cf-6ebe-4904-afe8-749d466fa6eb" path="/var/lib/kubelet/pods/90e610cf-6ebe-4904-afe8-749d466fa6eb/volumes" Dec 02 08:17:04 crc kubenswrapper[4691]: I1202 08:17:04.037173 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5bh6t"] Dec 02 08:17:04 crc kubenswrapper[4691]: I1202 08:17:04.044478 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-5bh6t"] Dec 02 08:17:04 crc kubenswrapper[4691]: I1202 08:17:04.472537 4691 generic.go:334] "Generic (PLEG): container finished" podID="c4804dc1-5ac2-422e-87fe-71120becde69" containerID="3f2624b84bbc70d54404b7842e587dc17ab4667f43deea61ea037bb28ad92596" exitCode=0 Dec 02 08:17:04 crc kubenswrapper[4691]: I1202 08:17:04.472586 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" event={"ID":"c4804dc1-5ac2-422e-87fe-71120becde69","Type":"ContainerDied","Data":"3f2624b84bbc70d54404b7842e587dc17ab4667f43deea61ea037bb28ad92596"} Dec 02 08:17:04 crc kubenswrapper[4691]: I1202 08:17:04.576992 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3963ab52-3d58-4201-baa1-6743421bdca3" path="/var/lib/kubelet/pods/3963ab52-3d58-4201-baa1-6743421bdca3/volumes" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:05.999864 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.168885 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xf7g7\" (UniqueName: \"kubernetes.io/projected/c4804dc1-5ac2-422e-87fe-71120becde69-kube-api-access-xf7g7\") pod \"c4804dc1-5ac2-422e-87fe-71120becde69\" (UID: \"c4804dc1-5ac2-422e-87fe-71120becde69\") " Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.168945 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4804dc1-5ac2-422e-87fe-71120becde69-ssh-key\") pod \"c4804dc1-5ac2-422e-87fe-71120becde69\" (UID: \"c4804dc1-5ac2-422e-87fe-71120becde69\") " Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.168976 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4804dc1-5ac2-422e-87fe-71120becde69-inventory\") pod \"c4804dc1-5ac2-422e-87fe-71120becde69\" (UID: \"c4804dc1-5ac2-422e-87fe-71120becde69\") " Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.188246 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4804dc1-5ac2-422e-87fe-71120becde69-kube-api-access-xf7g7" (OuterVolumeSpecName: "kube-api-access-xf7g7") pod "c4804dc1-5ac2-422e-87fe-71120becde69" (UID: "c4804dc1-5ac2-422e-87fe-71120becde69"). InnerVolumeSpecName "kube-api-access-xf7g7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.205787 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4804dc1-5ac2-422e-87fe-71120becde69-inventory" (OuterVolumeSpecName: "inventory") pod "c4804dc1-5ac2-422e-87fe-71120becde69" (UID: "c4804dc1-5ac2-422e-87fe-71120becde69"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.220806 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4804dc1-5ac2-422e-87fe-71120becde69-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c4804dc1-5ac2-422e-87fe-71120becde69" (UID: "c4804dc1-5ac2-422e-87fe-71120becde69"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.271363 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xf7g7\" (UniqueName: \"kubernetes.io/projected/c4804dc1-5ac2-422e-87fe-71120becde69-kube-api-access-xf7g7\") on node \"crc\" DevicePath \"\"" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.271400 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c4804dc1-5ac2-422e-87fe-71120becde69-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.271410 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c4804dc1-5ac2-422e-87fe-71120becde69-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.507745 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" event={"ID":"c4804dc1-5ac2-422e-87fe-71120becde69","Type":"ContainerDied","Data":"32b1e6c60fab689b0bb60c1671c5342944e1f384b9a786ac7bf43bae81b6a7af"} Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.507841 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32b1e6c60fab689b0bb60c1671c5342944e1f384b9a786ac7bf43bae81b6a7af" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.507811 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-lx2xx" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.592745 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66"] Dec 02 08:17:06 crc kubenswrapper[4691]: E1202 08:17:06.593669 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4804dc1-5ac2-422e-87fe-71120becde69" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.593741 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4804dc1-5ac2-422e-87fe-71120becde69" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.594043 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4804dc1-5ac2-422e-87fe-71120becde69" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.594960 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.611228 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66"] Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.611887 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.612305 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.612490 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.612639 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.680856 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lvg66\" (UID: \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.681026 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lvg66\" (UID: \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.681258 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d9wh\" (UniqueName: \"kubernetes.io/projected/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-kube-api-access-9d9wh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lvg66\" (UID: \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.783959 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lvg66\" (UID: \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.784082 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lvg66\" (UID: \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.784196 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d9wh\" (UniqueName: \"kubernetes.io/projected/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-kube-api-access-9d9wh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lvg66\" 
(UID: \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.789606 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lvg66\" (UID: \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.790636 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lvg66\" (UID: \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.805713 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d9wh\" (UniqueName: \"kubernetes.io/projected/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-kube-api-access-9d9wh\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-lvg66\" (UID: \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:17:06 crc kubenswrapper[4691]: I1202 08:17:06.919880 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:17:07 crc kubenswrapper[4691]: I1202 08:17:07.539533 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66"] Dec 02 08:17:08 crc kubenswrapper[4691]: I1202 08:17:08.529201 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" event={"ID":"4fde2bba-1e5a-47a2-a918-8e57f11e6d95","Type":"ContainerStarted","Data":"4d484e0693b6045633f3f118611db4778e6f39bb62cff1bdf1e89c8b43ff81d8"} Dec 02 08:17:09 crc kubenswrapper[4691]: I1202 08:17:09.557710 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" event={"ID":"4fde2bba-1e5a-47a2-a918-8e57f11e6d95","Type":"ContainerStarted","Data":"498394df58293111c7d943d0494fe671f62b04ebfb0b8954779500a6e3a4bd9a"} Dec 02 08:17:09 crc kubenswrapper[4691]: I1202 08:17:09.578924 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" podStartSLOduration=2.848915172 podStartE2EDuration="3.578900498s" podCreationTimestamp="2025-12-02 08:17:06 +0000 UTC" firstStartedPulling="2025-12-02 08:17:07.550822404 +0000 UTC m=+1875.334901266" lastFinishedPulling="2025-12-02 08:17:08.28080773 +0000 UTC m=+1876.064886592" observedRunningTime="2025-12-02 08:17:09.573232755 +0000 UTC m=+1877.357311637" watchObservedRunningTime="2025-12-02 08:17:09.578900498 +0000 UTC m=+1877.362979360" Dec 02 08:17:10 crc kubenswrapper[4691]: I1202 08:17:10.561414 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:17:10 crc kubenswrapper[4691]: E1202 08:17:10.562837 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:17:22 crc kubenswrapper[4691]: I1202 08:17:22.569551 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:17:22 crc kubenswrapper[4691]: E1202 08:17:22.570520 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:17:30 crc kubenswrapper[4691]: I1202 08:17:30.737681 4691 scope.go:117] "RemoveContainer" containerID="b43372f37196cc7c10c33b8f309e2ac4d0c3cbb7f0fee43448aecd1ce4547a0d" Dec 02 08:17:30 crc kubenswrapper[4691]: I1202 08:17:30.802382 4691 scope.go:117] "RemoveContainer" containerID="feb8f9c0f6ecbf6f086af84ef48419cd78a9a8cc4e31332a3143c405b1fc6a1b" Dec 02 08:17:30 crc kubenswrapper[4691]: I1202 08:17:30.848546 4691 scope.go:117] "RemoveContainer" containerID="c8ee537f19bcb052d12c87824157b6652a90e3b8b567d5e70a320cd6f372eb98" Dec 02 08:17:34 crc kubenswrapper[4691]: I1202 08:17:34.562085 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:17:34 crc kubenswrapper[4691]: E1202 08:17:34.562857 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:17:48 crc kubenswrapper[4691]: I1202 08:17:48.041539 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-kbjvb"] Dec 02 08:17:48 crc kubenswrapper[4691]: I1202 08:17:48.049479 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-kbjvb"] Dec 02 08:17:48 crc kubenswrapper[4691]: I1202 08:17:48.562565 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:17:48 crc kubenswrapper[4691]: E1202 08:17:48.562982 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:17:48 crc kubenswrapper[4691]: I1202 08:17:48.573743 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86775270-21b6-4ffa-a279-8bc76a6ca396" path="/var/lib/kubelet/pods/86775270-21b6-4ffa-a279-8bc76a6ca396/volumes" Dec 02 08:18:00 crc kubenswrapper[4691]: I1202 08:18:00.024150 4691 generic.go:334] "Generic (PLEG): container finished" 
podID="4fde2bba-1e5a-47a2-a918-8e57f11e6d95" containerID="498394df58293111c7d943d0494fe671f62b04ebfb0b8954779500a6e3a4bd9a" exitCode=0 Dec 02 08:18:00 crc kubenswrapper[4691]: I1202 08:18:00.024213 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" event={"ID":"4fde2bba-1e5a-47a2-a918-8e57f11e6d95","Type":"ContainerDied","Data":"498394df58293111c7d943d0494fe671f62b04ebfb0b8954779500a6e3a4bd9a"} Dec 02 08:18:01 crc kubenswrapper[4691]: I1202 08:18:01.465082 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:18:01 crc kubenswrapper[4691]: I1202 08:18:01.568322 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-inventory\") pod \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\" (UID: \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\") " Dec 02 08:18:01 crc kubenswrapper[4691]: I1202 08:18:01.568560 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d9wh\" (UniqueName: \"kubernetes.io/projected/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-kube-api-access-9d9wh\") pod \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\" (UID: \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\") " Dec 02 08:18:01 crc kubenswrapper[4691]: I1202 08:18:01.568689 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-ssh-key\") pod \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\" (UID: \"4fde2bba-1e5a-47a2-a918-8e57f11e6d95\") " Dec 02 08:18:01 crc kubenswrapper[4691]: I1202 08:18:01.577152 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-kube-api-access-9d9wh" (OuterVolumeSpecName: "kube-api-access-9d9wh") pod "4fde2bba-1e5a-47a2-a918-8e57f11e6d95" (UID: "4fde2bba-1e5a-47a2-a918-8e57f11e6d95"). InnerVolumeSpecName "kube-api-access-9d9wh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:18:01 crc kubenswrapper[4691]: I1202 08:18:01.606675 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-inventory" (OuterVolumeSpecName: "inventory") pod "4fde2bba-1e5a-47a2-a918-8e57f11e6d95" (UID: "4fde2bba-1e5a-47a2-a918-8e57f11e6d95"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:18:01 crc kubenswrapper[4691]: I1202 08:18:01.608380 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4fde2bba-1e5a-47a2-a918-8e57f11e6d95" (UID: "4fde2bba-1e5a-47a2-a918-8e57f11e6d95"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:18:01 crc kubenswrapper[4691]: I1202 08:18:01.671247 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:18:01 crc kubenswrapper[4691]: I1202 08:18:01.671301 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:18:01 crc kubenswrapper[4691]: I1202 08:18:01.671314 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d9wh\" (UniqueName: \"kubernetes.io/projected/4fde2bba-1e5a-47a2-a918-8e57f11e6d95-kube-api-access-9d9wh\") on node \"crc\" DevicePath \"\"" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.051354 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" event={"ID":"4fde2bba-1e5a-47a2-a918-8e57f11e6d95","Type":"ContainerDied","Data":"4d484e0693b6045633f3f118611db4778e6f39bb62cff1bdf1e89c8b43ff81d8"} Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.051979 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d484e0693b6045633f3f118611db4778e6f39bb62cff1bdf1e89c8b43ff81d8" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.051444 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-lvg66" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.144138 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fsbvf"] Dec 02 08:18:02 crc kubenswrapper[4691]: E1202 08:18:02.144560 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fde2bba-1e5a-47a2-a918-8e57f11e6d95" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.144578 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fde2bba-1e5a-47a2-a918-8e57f11e6d95" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.144805 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fde2bba-1e5a-47a2-a918-8e57f11e6d95" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.145548 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.150328 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.150543 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.150425 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.150599 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.154445 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fsbvf"] Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.282435 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfc83744-d9e3-4520-96ea-2ce6e382af39-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fsbvf\" (UID: \"bfc83744-d9e3-4520-96ea-2ce6e382af39\") " pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.282639 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bfc83744-d9e3-4520-96ea-2ce6e382af39-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fsbvf\" (UID: \"bfc83744-d9e3-4520-96ea-2ce6e382af39\") " pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.282672 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7phx9\" (UniqueName: \"kubernetes.io/projected/bfc83744-d9e3-4520-96ea-2ce6e382af39-kube-api-access-7phx9\") pod \"ssh-known-hosts-edpm-deployment-fsbvf\" (UID: \"bfc83744-d9e3-4520-96ea-2ce6e382af39\") " pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.383968 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfc83744-d9e3-4520-96ea-2ce6e382af39-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fsbvf\" (UID: \"bfc83744-d9e3-4520-96ea-2ce6e382af39\") " pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.384398 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bfc83744-d9e3-4520-96ea-2ce6e382af39-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fsbvf\" (UID: \"bfc83744-d9e3-4520-96ea-2ce6e382af39\") " pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.384523 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7phx9\" (UniqueName: \"kubernetes.io/projected/bfc83744-d9e3-4520-96ea-2ce6e382af39-kube-api-access-7phx9\") pod \"ssh-known-hosts-edpm-deployment-fsbvf\" (UID: \"bfc83744-d9e3-4520-96ea-2ce6e382af39\") " pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:02 crc 
kubenswrapper[4691]: I1202 08:18:02.388660 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bfc83744-d9e3-4520-96ea-2ce6e382af39-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-fsbvf\" (UID: \"bfc83744-d9e3-4520-96ea-2ce6e382af39\") " pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.389656 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfc83744-d9e3-4520-96ea-2ce6e382af39-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-fsbvf\" (UID: \"bfc83744-d9e3-4520-96ea-2ce6e382af39\") " pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.403447 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7phx9\" (UniqueName: \"kubernetes.io/projected/bfc83744-d9e3-4520-96ea-2ce6e382af39-kube-api-access-7phx9\") pod \"ssh-known-hosts-edpm-deployment-fsbvf\" (UID: \"bfc83744-d9e3-4520-96ea-2ce6e382af39\") " pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.462521 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.588151 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:18:02 crc kubenswrapper[4691]: E1202 08:18:02.588392 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:18:02 crc kubenswrapper[4691]: W1202 08:18:02.953597 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfc83744_d9e3_4520_96ea_2ce6e382af39.slice/crio-7c45fa529045baefdba45f3bc02d68956441c24030cf3683e5286925a2810074 WatchSource:0}: Error finding container 7c45fa529045baefdba45f3bc02d68956441c24030cf3683e5286925a2810074: Status 404 returned error can't find the container with id 7c45fa529045baefdba45f3bc02d68956441c24030cf3683e5286925a2810074 Dec 02 08:18:02 crc kubenswrapper[4691]: I1202 08:18:02.953908 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-fsbvf"] Dec 02 08:18:03 crc kubenswrapper[4691]: I1202 08:18:03.059658 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" event={"ID":"bfc83744-d9e3-4520-96ea-2ce6e382af39","Type":"ContainerStarted","Data":"7c45fa529045baefdba45f3bc02d68956441c24030cf3683e5286925a2810074"} Dec 02 08:18:04 crc kubenswrapper[4691]: I1202 08:18:04.070023 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" event={"ID":"bfc83744-d9e3-4520-96ea-2ce6e382af39","Type":"ContainerStarted","Data":"e657347631a1c6d1fd746fec648bff95fe14420844be36363292a13ec6a6e6a2"} Dec 02 08:18:04 crc kubenswrapper[4691]: I1202 08:18:04.095463 4691 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" podStartSLOduration=1.628294968 podStartE2EDuration="2.095444713s" podCreationTimestamp="2025-12-02 08:18:02 +0000 UTC" firstStartedPulling="2025-12-02 08:18:02.955692989 +0000 UTC m=+1930.739771841" lastFinishedPulling="2025-12-02 08:18:03.422842684 +0000 UTC m=+1931.206921586" observedRunningTime="2025-12-02 08:18:04.088513664 +0000 UTC m=+1931.872592516" watchObservedRunningTime="2025-12-02 08:18:04.095444713 +0000 UTC m=+1931.879523575" Dec 02 08:18:11 crc kubenswrapper[4691]: I1202 08:18:11.135467 4691 generic.go:334] "Generic (PLEG): container finished" podID="bfc83744-d9e3-4520-96ea-2ce6e382af39" containerID="e657347631a1c6d1fd746fec648bff95fe14420844be36363292a13ec6a6e6a2" exitCode=0 Dec 02 08:18:11 crc kubenswrapper[4691]: I1202 08:18:11.135561 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" event={"ID":"bfc83744-d9e3-4520-96ea-2ce6e382af39","Type":"ContainerDied","Data":"e657347631a1c6d1fd746fec648bff95fe14420844be36363292a13ec6a6e6a2"} Dec 02 08:18:12 crc kubenswrapper[4691]: I1202 08:18:12.553620 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:12 crc kubenswrapper[4691]: I1202 08:18:12.686615 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfc83744-d9e3-4520-96ea-2ce6e382af39-ssh-key-openstack-edpm-ipam\") pod \"bfc83744-d9e3-4520-96ea-2ce6e382af39\" (UID: \"bfc83744-d9e3-4520-96ea-2ce6e382af39\") " Dec 02 08:18:12 crc kubenswrapper[4691]: I1202 08:18:12.686728 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7phx9\" (UniqueName: \"kubernetes.io/projected/bfc83744-d9e3-4520-96ea-2ce6e382af39-kube-api-access-7phx9\") pod \"bfc83744-d9e3-4520-96ea-2ce6e382af39\" (UID: \"bfc83744-d9e3-4520-96ea-2ce6e382af39\") " Dec 02 08:18:12 crc kubenswrapper[4691]: I1202 08:18:12.686784 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bfc83744-d9e3-4520-96ea-2ce6e382af39-inventory-0\") pod \"bfc83744-d9e3-4520-96ea-2ce6e382af39\" (UID: \"bfc83744-d9e3-4520-96ea-2ce6e382af39\") " Dec 02 08:18:12 crc kubenswrapper[4691]: I1202 08:18:12.693105 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfc83744-d9e3-4520-96ea-2ce6e382af39-kube-api-access-7phx9" (OuterVolumeSpecName: "kube-api-access-7phx9") pod "bfc83744-d9e3-4520-96ea-2ce6e382af39" (UID: "bfc83744-d9e3-4520-96ea-2ce6e382af39"). InnerVolumeSpecName "kube-api-access-7phx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:18:12 crc kubenswrapper[4691]: I1202 08:18:12.736865 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc83744-d9e3-4520-96ea-2ce6e382af39-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "bfc83744-d9e3-4520-96ea-2ce6e382af39" (UID: "bfc83744-d9e3-4520-96ea-2ce6e382af39"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:18:12 crc kubenswrapper[4691]: I1202 08:18:12.741222 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfc83744-d9e3-4520-96ea-2ce6e382af39-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "bfc83744-d9e3-4520-96ea-2ce6e382af39" (UID: "bfc83744-d9e3-4520-96ea-2ce6e382af39"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:18:12 crc kubenswrapper[4691]: I1202 08:18:12.791437 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/bfc83744-d9e3-4520-96ea-2ce6e382af39-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 02 08:18:12 crc kubenswrapper[4691]: I1202 08:18:12.791469 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7phx9\" (UniqueName: \"kubernetes.io/projected/bfc83744-d9e3-4520-96ea-2ce6e382af39-kube-api-access-7phx9\") on node \"crc\" DevicePath \"\"" Dec 02 08:18:12 crc kubenswrapper[4691]: I1202 08:18:12.791478 4691 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/bfc83744-d9e3-4520-96ea-2ce6e382af39-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.156689 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" event={"ID":"bfc83744-d9e3-4520-96ea-2ce6e382af39","Type":"ContainerDied","Data":"7c45fa529045baefdba45f3bc02d68956441c24030cf3683e5286925a2810074"} Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.157041 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c45fa529045baefdba45f3bc02d68956441c24030cf3683e5286925a2810074" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.156753 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-fsbvf" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.239486 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7"] Dec 02 08:18:13 crc kubenswrapper[4691]: E1202 08:18:13.240112 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfc83744-d9e3-4520-96ea-2ce6e382af39" containerName="ssh-known-hosts-edpm-deployment" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.240132 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfc83744-d9e3-4520-96ea-2ce6e382af39" containerName="ssh-known-hosts-edpm-deployment" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.240403 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfc83744-d9e3-4520-96ea-2ce6e382af39" containerName="ssh-known-hosts-edpm-deployment" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.241334 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.245209 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.245412 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.246019 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.246042 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.251699 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7"] Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.303543 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1607b522-c05f-4f86-b8cb-79caa03799ed-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nq6h7\" (UID: \"1607b522-c05f-4f86-b8cb-79caa03799ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.303662 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1607b522-c05f-4f86-b8cb-79caa03799ed-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nq6h7\" (UID: \"1607b522-c05f-4f86-b8cb-79caa03799ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.303695 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgkrw\" (UniqueName: \"kubernetes.io/projected/1607b522-c05f-4f86-b8cb-79caa03799ed-kube-api-access-tgkrw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nq6h7\" (UID: \"1607b522-c05f-4f86-b8cb-79caa03799ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.406020 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1607b522-c05f-4f86-b8cb-79caa03799ed-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nq6h7\" (UID: \"1607b522-c05f-4f86-b8cb-79caa03799ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.406154 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1607b522-c05f-4f86-b8cb-79caa03799ed-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nq6h7\" (UID: \"1607b522-c05f-4f86-b8cb-79caa03799ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.406190 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgkrw\" (UniqueName: \"kubernetes.io/projected/1607b522-c05f-4f86-b8cb-79caa03799ed-kube-api-access-tgkrw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nq6h7\" (UID: \"1607b522-c05f-4f86-b8cb-79caa03799ed\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.416869 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1607b522-c05f-4f86-b8cb-79caa03799ed-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nq6h7\" (UID: \"1607b522-c05f-4f86-b8cb-79caa03799ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.417315 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1607b522-c05f-4f86-b8cb-79caa03799ed-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nq6h7\" (UID: \"1607b522-c05f-4f86-b8cb-79caa03799ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.422909 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgkrw\" (UniqueName: \"kubernetes.io/projected/1607b522-c05f-4f86-b8cb-79caa03799ed-kube-api-access-tgkrw\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-nq6h7\" (UID: \"1607b522-c05f-4f86-b8cb-79caa03799ed\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:13 crc kubenswrapper[4691]: I1202 08:18:13.574214 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:14 crc kubenswrapper[4691]: I1202 08:18:14.104870 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7"] Dec 02 08:18:14 crc kubenswrapper[4691]: I1202 08:18:14.172017 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" event={"ID":"1607b522-c05f-4f86-b8cb-79caa03799ed","Type":"ContainerStarted","Data":"2c8d323696b32b2155f7c922409365620e2bffdffc886956a8fa1064756d887d"} Dec 02 08:18:15 crc kubenswrapper[4691]: I1202 08:18:15.182296 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" event={"ID":"1607b522-c05f-4f86-b8cb-79caa03799ed","Type":"ContainerStarted","Data":"df52b10ac3855f3788458a3fae341fa1620bd8dc8ccb4b04e8c82d1848879966"} Dec 02 08:18:15 crc kubenswrapper[4691]: I1202 08:18:15.204422 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" podStartSLOduration=1.621841936 podStartE2EDuration="2.204368062s" podCreationTimestamp="2025-12-02 08:18:13 +0000 UTC" firstStartedPulling="2025-12-02 08:18:14.108587114 +0000 UTC m=+1941.892665966" lastFinishedPulling="2025-12-02 08:18:14.69111323 +0000 UTC m=+1942.475192092" observedRunningTime="2025-12-02 08:18:15.200477677 +0000 UTC m=+1942.984556569" watchObservedRunningTime="2025-12-02 08:18:15.204368062 +0000 UTC m=+1942.988446924" Dec 02 08:18:17 crc kubenswrapper[4691]: I1202 08:18:17.561874 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:18:17 crc kubenswrapper[4691]: E1202 08:18:17.562435 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:18:23 crc kubenswrapper[4691]: I1202 08:18:23.263482 4691 generic.go:334] "Generic (PLEG): container finished" podID="1607b522-c05f-4f86-b8cb-79caa03799ed" containerID="df52b10ac3855f3788458a3fae341fa1620bd8dc8ccb4b04e8c82d1848879966" exitCode=0 Dec 02 08:18:23 crc kubenswrapper[4691]: I1202 08:18:23.264183 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" event={"ID":"1607b522-c05f-4f86-b8cb-79caa03799ed","Type":"ContainerDied","Data":"df52b10ac3855f3788458a3fae341fa1620bd8dc8ccb4b04e8c82d1848879966"} Dec 02 08:18:24 crc kubenswrapper[4691]: I1202 08:18:24.718153 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:24 crc kubenswrapper[4691]: I1202 08:18:24.732737 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgkrw\" (UniqueName: \"kubernetes.io/projected/1607b522-c05f-4f86-b8cb-79caa03799ed-kube-api-access-tgkrw\") pod \"1607b522-c05f-4f86-b8cb-79caa03799ed\" (UID: \"1607b522-c05f-4f86-b8cb-79caa03799ed\") " Dec 02 08:18:24 crc kubenswrapper[4691]: I1202 08:18:24.733054 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1607b522-c05f-4f86-b8cb-79caa03799ed-inventory\") pod \"1607b522-c05f-4f86-b8cb-79caa03799ed\" (UID: \"1607b522-c05f-4f86-b8cb-79caa03799ed\") " Dec 02 08:18:24 crc kubenswrapper[4691]: I1202 08:18:24.733157 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1607b522-c05f-4f86-b8cb-79caa03799ed-ssh-key\") pod \"1607b522-c05f-4f86-b8cb-79caa03799ed\" (UID: \"1607b522-c05f-4f86-b8cb-79caa03799ed\") " Dec 02 08:18:24 crc kubenswrapper[4691]: I1202 08:18:24.739607 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1607b522-c05f-4f86-b8cb-79caa03799ed-kube-api-access-tgkrw" (OuterVolumeSpecName: "kube-api-access-tgkrw") pod "1607b522-c05f-4f86-b8cb-79caa03799ed" (UID: "1607b522-c05f-4f86-b8cb-79caa03799ed"). InnerVolumeSpecName "kube-api-access-tgkrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:18:24 crc kubenswrapper[4691]: I1202 08:18:24.762064 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1607b522-c05f-4f86-b8cb-79caa03799ed-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1607b522-c05f-4f86-b8cb-79caa03799ed" (UID: "1607b522-c05f-4f86-b8cb-79caa03799ed"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:18:24 crc kubenswrapper[4691]: I1202 08:18:24.768562 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1607b522-c05f-4f86-b8cb-79caa03799ed-inventory" (OuterVolumeSpecName: "inventory") pod "1607b522-c05f-4f86-b8cb-79caa03799ed" (UID: "1607b522-c05f-4f86-b8cb-79caa03799ed"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:18:24 crc kubenswrapper[4691]: I1202 08:18:24.836144 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1607b522-c05f-4f86-b8cb-79caa03799ed-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:18:24 crc kubenswrapper[4691]: I1202 08:18:24.836193 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1607b522-c05f-4f86-b8cb-79caa03799ed-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:18:24 crc kubenswrapper[4691]: I1202 08:18:24.836204 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgkrw\" (UniqueName: \"kubernetes.io/projected/1607b522-c05f-4f86-b8cb-79caa03799ed-kube-api-access-tgkrw\") on node \"crc\" DevicePath \"\"" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.280676 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" event={"ID":"1607b522-c05f-4f86-b8cb-79caa03799ed","Type":"ContainerDied","Data":"2c8d323696b32b2155f7c922409365620e2bffdffc886956a8fa1064756d887d"} Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.280719 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c8d323696b32b2155f7c922409365620e2bffdffc886956a8fa1064756d887d" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.280728 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-nq6h7" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.360032 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx"] Dec 02 08:18:25 crc kubenswrapper[4691]: E1202 08:18:25.360657 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1607b522-c05f-4f86-b8cb-79caa03799ed" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.360685 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1607b522-c05f-4f86-b8cb-79caa03799ed" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.360899 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1607b522-c05f-4f86-b8cb-79caa03799ed" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.361783 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.365729 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.365992 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.366047 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.366063 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.372719 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx"] Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.449242 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31d7c220-1ece-46e7-bbe3-1737890c15e0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx\" (UID: \"31d7c220-1ece-46e7-bbe3-1737890c15e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.449340 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdr5z\" (UniqueName: \"kubernetes.io/projected/31d7c220-1ece-46e7-bbe3-1737890c15e0-kube-api-access-sdr5z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx\" (UID: \"31d7c220-1ece-46e7-bbe3-1737890c15e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.449372 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d7c220-1ece-46e7-bbe3-1737890c15e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx\" (UID: \"31d7c220-1ece-46e7-bbe3-1737890c15e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.551303 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdr5z\" (UniqueName: \"kubernetes.io/projected/31d7c220-1ece-46e7-bbe3-1737890c15e0-kube-api-access-sdr5z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx\" (UID: \"31d7c220-1ece-46e7-bbe3-1737890c15e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.551376 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d7c220-1ece-46e7-bbe3-1737890c15e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx\" (UID: \"31d7c220-1ece-46e7-bbe3-1737890c15e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.551486 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31d7c220-1ece-46e7-bbe3-1737890c15e0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx\" (UID: 
\"31d7c220-1ece-46e7-bbe3-1737890c15e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.558914 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31d7c220-1ece-46e7-bbe3-1737890c15e0-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx\" (UID: \"31d7c220-1ece-46e7-bbe3-1737890c15e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.559352 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d7c220-1ece-46e7-bbe3-1737890c15e0-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx\" (UID: \"31d7c220-1ece-46e7-bbe3-1737890c15e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.570361 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdr5z\" (UniqueName: \"kubernetes.io/projected/31d7c220-1ece-46e7-bbe3-1737890c15e0-kube-api-access-sdr5z\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx\" (UID: \"31d7c220-1ece-46e7-bbe3-1737890c15e0\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" Dec 02 08:18:25 crc kubenswrapper[4691]: I1202 08:18:25.684032 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" Dec 02 08:18:26 crc kubenswrapper[4691]: I1202 08:18:26.190239 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx"] Dec 02 08:18:26 crc kubenswrapper[4691]: I1202 08:18:26.291502 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" event={"ID":"31d7c220-1ece-46e7-bbe3-1737890c15e0","Type":"ContainerStarted","Data":"81404709e9346f8b786825299c81f06a835539b0a9e6dd2ea29620792cf0dbd5"} Dec 02 08:18:27 crc kubenswrapper[4691]: I1202 08:18:27.311900 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" event={"ID":"31d7c220-1ece-46e7-bbe3-1737890c15e0","Type":"ContainerStarted","Data":"5060a0af14c26af9ad76ab6e67c43291042ec2ad4310808b44991219f95f759c"} Dec 02 08:18:27 crc kubenswrapper[4691]: I1202 08:18:27.341803 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" podStartSLOduration=1.831972746 podStartE2EDuration="2.341786854s" podCreationTimestamp="2025-12-02 08:18:25 +0000 UTC" firstStartedPulling="2025-12-02 08:18:26.197727734 +0000 UTC m=+1953.981806596" lastFinishedPulling="2025-12-02 08:18:26.707541842 +0000 UTC m=+1954.491620704" observedRunningTime="2025-12-02 08:18:27.341411824 +0000 UTC m=+1955.125490686" watchObservedRunningTime="2025-12-02 08:18:27.341786854 +0000 UTC m=+1955.125865716" Dec 02 08:18:29 crc kubenswrapper[4691]: I1202 08:18:29.561557 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:18:29 crc kubenswrapper[4691]: E1202 08:18:29.561836 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
Dec 02 08:18:30 crc kubenswrapper[4691]: I1202 08:18:30.959728 4691 scope.go:117] "RemoveContainer" containerID="8a7c013cd08c51aa6b83f3c007a59d89c34b491d8cc38c528c437d2190932fe0"
Dec 02 08:18:37 crc kubenswrapper[4691]: I1202 08:18:37.400832 4691 generic.go:334] "Generic (PLEG): container finished" podID="31d7c220-1ece-46e7-bbe3-1737890c15e0" containerID="5060a0af14c26af9ad76ab6e67c43291042ec2ad4310808b44991219f95f759c" exitCode=0
Dec 02 08:18:37 crc kubenswrapper[4691]: I1202 08:18:37.400917 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" event={"ID":"31d7c220-1ece-46e7-bbe3-1737890c15e0","Type":"ContainerDied","Data":"5060a0af14c26af9ad76ab6e67c43291042ec2ad4310808b44991219f95f759c"}
Dec 02 08:18:38 crc kubenswrapper[4691]: I1202 08:18:38.807918 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx"
Dec 02 08:18:38 crc kubenswrapper[4691]: I1202 08:18:38.932109 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdr5z\" (UniqueName: \"kubernetes.io/projected/31d7c220-1ece-46e7-bbe3-1737890c15e0-kube-api-access-sdr5z\") pod \"31d7c220-1ece-46e7-bbe3-1737890c15e0\" (UID: \"31d7c220-1ece-46e7-bbe3-1737890c15e0\") "
Dec 02 08:18:38 crc kubenswrapper[4691]: I1202 08:18:38.932464 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d7c220-1ece-46e7-bbe3-1737890c15e0-inventory\") pod \"31d7c220-1ece-46e7-bbe3-1737890c15e0\" (UID: \"31d7c220-1ece-46e7-bbe3-1737890c15e0\") "
Dec 02 08:18:38 crc kubenswrapper[4691]: I1202 08:18:38.932625 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31d7c220-1ece-46e7-bbe3-1737890c15e0-ssh-key\") pod \"31d7c220-1ece-46e7-bbe3-1737890c15e0\" (UID: \"31d7c220-1ece-46e7-bbe3-1737890c15e0\") "
Dec 02 08:18:38 crc kubenswrapper[4691]: I1202 08:18:38.938922 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d7c220-1ece-46e7-bbe3-1737890c15e0-kube-api-access-sdr5z" (OuterVolumeSpecName: "kube-api-access-sdr5z") pod "31d7c220-1ece-46e7-bbe3-1737890c15e0" (UID: "31d7c220-1ece-46e7-bbe3-1737890c15e0"). InnerVolumeSpecName "kube-api-access-sdr5z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:18:38 crc kubenswrapper[4691]: I1202 08:18:38.964023 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d7c220-1ece-46e7-bbe3-1737890c15e0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "31d7c220-1ece-46e7-bbe3-1737890c15e0" (UID: "31d7c220-1ece-46e7-bbe3-1737890c15e0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:18:38 crc kubenswrapper[4691]: I1202 08:18:38.966773 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d7c220-1ece-46e7-bbe3-1737890c15e0-inventory" (OuterVolumeSpecName: "inventory") pod "31d7c220-1ece-46e7-bbe3-1737890c15e0" (UID: "31d7c220-1ece-46e7-bbe3-1737890c15e0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.035822 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdr5z\" (UniqueName: \"kubernetes.io/projected/31d7c220-1ece-46e7-bbe3-1737890c15e0-kube-api-access-sdr5z\") on node \"crc\" DevicePath \"\""
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.035894 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31d7c220-1ece-46e7-bbe3-1737890c15e0-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.035912 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/31d7c220-1ece-46e7-bbe3-1737890c15e0-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.421415 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx" event={"ID":"31d7c220-1ece-46e7-bbe3-1737890c15e0","Type":"ContainerDied","Data":"81404709e9346f8b786825299c81f06a835539b0a9e6dd2ea29620792cf0dbd5"}
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.421483 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81404709e9346f8b786825299c81f06a835539b0a9e6dd2ea29620792cf0dbd5"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.421492 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.516671 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"]
Dec 02 08:18:39 crc kubenswrapper[4691]: E1202 08:18:39.517392 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31d7c220-1ece-46e7-bbe3-1737890c15e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.517494 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="31d7c220-1ece-46e7-bbe3-1737890c15e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.517897 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="31d7c220-1ece-46e7-bbe3-1737890c15e0" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.519001 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.522751 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.523079 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.523206 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.523384 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.523087 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.523703 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.524123 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.528564 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.538869 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"]
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.649853 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4wnl\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-kube-api-access-k4wnl\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.649915 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.649981 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.650040 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.650112 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.650145 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.650936 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.651041 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.651192 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.651375 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.651464 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
\"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.651538 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.651635 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.651769 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754156 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4wnl\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-kube-api-access-k4wnl\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754206 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754235 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754274 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc 
kubenswrapper[4691]: I1202 08:18:39.754319 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754340 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754377 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754393 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754423 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754450 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754473 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754501 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754542 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.754579 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.759637 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.759882 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.760187 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.761029 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.761079 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.761589 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.762533 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.762554 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.763390 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.763837 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.765017 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.766210 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.771386 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: 
\"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.772291 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4wnl\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-kube-api-access-k4wnl\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:39 crc kubenswrapper[4691]: I1202 08:18:39.839805 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:18:40 crc kubenswrapper[4691]: I1202 08:18:40.393660 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"] Dec 02 08:18:40 crc kubenswrapper[4691]: I1202 08:18:40.432468 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" event={"ID":"80054e7f-3448-487b-8f0e-fc5eda159e57","Type":"ContainerStarted","Data":"1e1e47a59ac65ea7a6cce3e8fecbd72b15fe1108e41fe9f4568c46a9d04a4696"} Dec 02 08:18:41 crc kubenswrapper[4691]: I1202 08:18:41.442052 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" event={"ID":"80054e7f-3448-487b-8f0e-fc5eda159e57","Type":"ContainerStarted","Data":"0bfb54fc4ea5bbdfc234696d550d08deec388e3d6ede04a4e03874e68fe6c56d"} Dec 02 08:18:41 crc kubenswrapper[4691]: I1202 08:18:41.466903 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" podStartSLOduration=1.918442555 podStartE2EDuration="2.466884007s" podCreationTimestamp="2025-12-02 08:18:39 +0000 UTC" firstStartedPulling="2025-12-02 08:18:40.399307609 +0000 UTC m=+1968.183386471" lastFinishedPulling="2025-12-02 08:18:40.947749061 +0000 UTC m=+1968.731827923" observedRunningTime="2025-12-02 08:18:41.458922282 +0000 UTC m=+1969.243001144" watchObservedRunningTime="2025-12-02 08:18:41.466884007 +0000 UTC m=+1969.250962869" Dec 02 08:18:43 crc kubenswrapper[4691]: I1202 08:18:43.566053 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:18:43 crc kubenswrapper[4691]: E1202 08:18:43.566610 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:18:57 crc kubenswrapper[4691]: I1202 08:18:57.562184 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:18:58 crc kubenswrapper[4691]: I1202 08:18:58.635383 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"76431cd8b2e3d8ce369a661c072e2a35d2afb457a7614f3fda9fe111278924cc"} Dec 02 08:19:18 crc kubenswrapper[4691]: I1202 
Dec 02 08:19:18 crc kubenswrapper[4691]: I1202 08:19:18.825026 4691 generic.go:334] "Generic (PLEG): container finished" podID="80054e7f-3448-487b-8f0e-fc5eda159e57" containerID="0bfb54fc4ea5bbdfc234696d550d08deec388e3d6ede04a4e03874e68fe6c56d" exitCode=0
Dec 02 08:19:18 crc kubenswrapper[4691]: I1202 08:19:18.825214 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" event={"ID":"80054e7f-3448-487b-8f0e-fc5eda159e57","Type":"ContainerDied","Data":"0bfb54fc4ea5bbdfc234696d550d08deec388e3d6ede04a4e03874e68fe6c56d"}
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.388570 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt"
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.495319 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.495369 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4wnl\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-kube-api-access-k4wnl\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.495398 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-bootstrap-combined-ca-bundle\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.495416 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-ovn-default-certs-0\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.495469 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-ssh-key\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.495553 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-telemetry-combined-ca-bundle\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.496283 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-ovn-combined-ca-bundle\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.496417 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-repo-setup-combined-ca-bundle\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.496461 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.496580 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-nova-combined-ca-bundle\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.496730 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-neutron-metadata-combined-ca-bundle\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.496874 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.497054 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-inventory\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.497248 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-libvirt-combined-ca-bundle\") pod \"80054e7f-3448-487b-8f0e-fc5eda159e57\" (UID: \"80054e7f-3448-487b-8f0e-fc5eda159e57\") "
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.504418 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.505119 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.505215 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.506303 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.506318 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-kube-api-access-k4wnl" (OuterVolumeSpecName: "kube-api-access-k4wnl") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "kube-api-access-k4wnl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.506512 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.506999 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.507065 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.507150 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.508088 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.509273 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.510362 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.534830 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-inventory" (OuterVolumeSpecName: "inventory") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.536411 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "80054e7f-3448-487b-8f0e-fc5eda159e57" (UID: "80054e7f-3448-487b-8f0e-fc5eda159e57"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.600986 4691 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.601272 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.601391 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4wnl\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-kube-api-access-k4wnl\") on node \"crc\" DevicePath \"\""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.601456 4691 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.601623 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.601710 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.601799 4691 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.601859 4691 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.601915 4691 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.601982 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.602233 4691 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.602288 4691 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
\"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.602346 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/80054e7f-3448-487b-8f0e-fc5eda159e57-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.602407 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/80054e7f-3448-487b-8f0e-fc5eda159e57-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.852536 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" event={"ID":"80054e7f-3448-487b-8f0e-fc5eda159e57","Type":"ContainerDied","Data":"1e1e47a59ac65ea7a6cce3e8fecbd72b15fe1108e41fe9f4568c46a9d04a4696"} Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.852904 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e1e47a59ac65ea7a6cce3e8fecbd72b15fe1108e41fe9f4568c46a9d04a4696" Dec 02 08:19:20 crc kubenswrapper[4691]: I1202 08:19:20.852691 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.000220 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82"] Dec 02 08:19:21 crc kubenswrapper[4691]: E1202 08:19:21.000883 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80054e7f-3448-487b-8f0e-fc5eda159e57" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.000911 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="80054e7f-3448-487b-8f0e-fc5eda159e57" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.001154 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="80054e7f-3448-487b-8f0e-fc5eda159e57" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.002542 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.006629 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.007370 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.007706 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.007799 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.008073 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.017166 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82"] Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.019284 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.019335 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk9p2\" (UniqueName: \"kubernetes.io/projected/87c36120-368b-47ab-baff-e007b39fc1d0-kube-api-access-kk9p2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.019362 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.019521 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/87c36120-368b-47ab-baff-e007b39fc1d0-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.019600 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.122069 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.122235 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.122281 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kk9p2\" (UniqueName: \"kubernetes.io/projected/87c36120-368b-47ab-baff-e007b39fc1d0-kube-api-access-kk9p2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.122315 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.122447 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/87c36120-368b-47ab-baff-e007b39fc1d0-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.124049 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/87c36120-368b-47ab-baff-e007b39fc1d0-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.128226 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.135825 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.137324 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") 
" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.140717 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk9p2\" (UniqueName: \"kubernetes.io/projected/87c36120-368b-47ab-baff-e007b39fc1d0-kube-api-access-kk9p2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-tkz82\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.329628 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.702518 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82"] Dec 02 08:19:21 crc kubenswrapper[4691]: I1202 08:19:21.869139 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" event={"ID":"87c36120-368b-47ab-baff-e007b39fc1d0","Type":"ContainerStarted","Data":"970789950d30828a159272e46a66b693a0e0ed1d95621a88a36e704de10c803e"} Dec 02 08:19:22 crc kubenswrapper[4691]: I1202 08:19:22.900231 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" event={"ID":"87c36120-368b-47ab-baff-e007b39fc1d0","Type":"ContainerStarted","Data":"c9392b389cd115cce127ef2fb659d98f6f909ce35b5cd8f1a6249292fa7afcda"} Dec 02 08:19:22 crc kubenswrapper[4691]: I1202 08:19:22.918662 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" podStartSLOduration=2.475735002 podStartE2EDuration="2.918642614s" podCreationTimestamp="2025-12-02 08:19:20 +0000 UTC" firstStartedPulling="2025-12-02 08:19:21.710277311 +0000 UTC m=+2009.494356173" lastFinishedPulling="2025-12-02 08:19:22.153184923 +0000 UTC m=+2009.937263785" observedRunningTime="2025-12-02 08:19:22.915671151 +0000 UTC m=+2010.699750013" watchObservedRunningTime="2025-12-02 08:19:22.918642614 +0000 UTC m=+2010.702721476" Dec 02 08:20:24 crc kubenswrapper[4691]: I1202 08:20:24.613377 4691 generic.go:334] "Generic (PLEG): container finished" podID="87c36120-368b-47ab-baff-e007b39fc1d0" containerID="c9392b389cd115cce127ef2fb659d98f6f909ce35b5cd8f1a6249292fa7afcda" exitCode=0 Dec 02 08:20:24 crc kubenswrapper[4691]: I1202 08:20:24.613922 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" event={"ID":"87c36120-368b-47ab-baff-e007b39fc1d0","Type":"ContainerDied","Data":"c9392b389cd115cce127ef2fb659d98f6f909ce35b5cd8f1a6249292fa7afcda"} Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.062203 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.223123 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-inventory\") pod \"87c36120-368b-47ab-baff-e007b39fc1d0\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.223201 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/87c36120-368b-47ab-baff-e007b39fc1d0-ovncontroller-config-0\") pod \"87c36120-368b-47ab-baff-e007b39fc1d0\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.223277 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk9p2\" (UniqueName: \"kubernetes.io/projected/87c36120-368b-47ab-baff-e007b39fc1d0-kube-api-access-kk9p2\") pod \"87c36120-368b-47ab-baff-e007b39fc1d0\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.223307 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-ovn-combined-ca-bundle\") pod \"87c36120-368b-47ab-baff-e007b39fc1d0\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.223380 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-ssh-key\") pod \"87c36120-368b-47ab-baff-e007b39fc1d0\" (UID: \"87c36120-368b-47ab-baff-e007b39fc1d0\") " Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.230813 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87c36120-368b-47ab-baff-e007b39fc1d0-kube-api-access-kk9p2" (OuterVolumeSpecName: "kube-api-access-kk9p2") pod "87c36120-368b-47ab-baff-e007b39fc1d0" (UID: "87c36120-368b-47ab-baff-e007b39fc1d0"). InnerVolumeSpecName "kube-api-access-kk9p2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.235607 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "87c36120-368b-47ab-baff-e007b39fc1d0" (UID: "87c36120-368b-47ab-baff-e007b39fc1d0"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.250123 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87c36120-368b-47ab-baff-e007b39fc1d0-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "87c36120-368b-47ab-baff-e007b39fc1d0" (UID: "87c36120-368b-47ab-baff-e007b39fc1d0"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.253671 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "87c36120-368b-47ab-baff-e007b39fc1d0" (UID: "87c36120-368b-47ab-baff-e007b39fc1d0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.274022 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-inventory" (OuterVolumeSpecName: "inventory") pod "87c36120-368b-47ab-baff-e007b39fc1d0" (UID: "87c36120-368b-47ab-baff-e007b39fc1d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.325674 4691 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.325712 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.325725 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87c36120-368b-47ab-baff-e007b39fc1d0-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.325734 4691 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/87c36120-368b-47ab-baff-e007b39fc1d0-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.325742 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kk9p2\" (UniqueName: \"kubernetes.io/projected/87c36120-368b-47ab-baff-e007b39fc1d0-kube-api-access-kk9p2\") on node \"crc\" DevicePath \"\"" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.649602 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" event={"ID":"87c36120-368b-47ab-baff-e007b39fc1d0","Type":"ContainerDied","Data":"970789950d30828a159272e46a66b693a0e0ed1d95621a88a36e704de10c803e"} Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.649652 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="970789950d30828a159272e46a66b693a0e0ed1d95621a88a36e704de10c803e" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.649692 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-tkz82" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.752415 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp"] Dec 02 08:20:26 crc kubenswrapper[4691]: E1202 08:20:26.753305 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87c36120-368b-47ab-baff-e007b39fc1d0" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.753331 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="87c36120-368b-47ab-baff-e007b39fc1d0" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.753548 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="87c36120-368b-47ab-baff-e007b39fc1d0" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.754322 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.760494 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.760986 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.761040 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.761389 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.762610 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.762730 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.789140 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp"] Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.939454 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crbfb\" (UniqueName: \"kubernetes.io/projected/c64d5e17-b659-47c6-aa5b-a62be849ee69-kube-api-access-crbfb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.939656 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.939779 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.939817 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.939868 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:26 crc kubenswrapper[4691]: I1202 08:20:26.939918 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.042124 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.042189 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.042243 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.042279 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-neutron-metadata-combined-ca-bundle\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.042388 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crbfb\" (UniqueName: \"kubernetes.io/projected/c64d5e17-b659-47c6-aa5b-a62be849ee69-kube-api-access-crbfb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.042446 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.048062 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.048288 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.048172 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.049069 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.049156 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.075779 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crbfb\" 
(UniqueName: \"kubernetes.io/projected/c64d5e17-b659-47c6-aa5b-a62be849ee69-kube-api-access-crbfb\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.086570 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:20:27 crc kubenswrapper[4691]: W1202 08:20:27.683650 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc64d5e17_b659_47c6_aa5b_a62be849ee69.slice/crio-110a6cd5d57c74887ac9d452fb2c956843caf172cde73308b35f91adad0a6c92 WatchSource:0}: Error finding container 110a6cd5d57c74887ac9d452fb2c956843caf172cde73308b35f91adad0a6c92: Status 404 returned error can't find the container with id 110a6cd5d57c74887ac9d452fb2c956843caf172cde73308b35f91adad0a6c92 Dec 02 08:20:27 crc kubenswrapper[4691]: I1202 08:20:27.683945 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp"] Dec 02 08:20:28 crc kubenswrapper[4691]: I1202 08:20:28.666986 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" event={"ID":"c64d5e17-b659-47c6-aa5b-a62be849ee69","Type":"ContainerStarted","Data":"15e56742165f175053976598ed491f4fbf26be874ce77fab230fc1f903be6862"} Dec 02 08:20:28 crc kubenswrapper[4691]: I1202 08:20:28.667343 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" event={"ID":"c64d5e17-b659-47c6-aa5b-a62be849ee69","Type":"ContainerStarted","Data":"110a6cd5d57c74887ac9d452fb2c956843caf172cde73308b35f91adad0a6c92"} Dec 02 08:20:28 crc kubenswrapper[4691]: I1202 08:20:28.690463 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" podStartSLOduration=2.130281795 podStartE2EDuration="2.690443253s" podCreationTimestamp="2025-12-02 08:20:26 +0000 UTC" firstStartedPulling="2025-12-02 08:20:27.685564609 +0000 UTC m=+2075.469643471" lastFinishedPulling="2025-12-02 08:20:28.245726067 +0000 UTC m=+2076.029804929" observedRunningTime="2025-12-02 08:20:28.688610378 +0000 UTC m=+2076.472689240" watchObservedRunningTime="2025-12-02 08:20:28.690443253 +0000 UTC m=+2076.474522125" Dec 02 08:21:17 crc kubenswrapper[4691]: I1202 08:21:17.117602 4691 generic.go:334] "Generic (PLEG): container finished" podID="c64d5e17-b659-47c6-aa5b-a62be849ee69" containerID="15e56742165f175053976598ed491f4fbf26be874ce77fab230fc1f903be6862" exitCode=0 Dec 02 08:21:17 crc kubenswrapper[4691]: I1202 08:21:17.117686 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" event={"ID":"c64d5e17-b659-47c6-aa5b-a62be849ee69","Type":"ContainerDied","Data":"15e56742165f175053976598ed491f4fbf26be874ce77fab230fc1f903be6862"} Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.625147 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.642205 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-neutron-ovn-metadata-agent-neutron-config-0\") pod \"c64d5e17-b659-47c6-aa5b-a62be849ee69\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.642379 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-inventory\") pod \"c64d5e17-b659-47c6-aa5b-a62be849ee69\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.642450 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-nova-metadata-neutron-config-0\") pod \"c64d5e17-b659-47c6-aa5b-a62be849ee69\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.642497 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crbfb\" (UniqueName: \"kubernetes.io/projected/c64d5e17-b659-47c6-aa5b-a62be849ee69-kube-api-access-crbfb\") pod \"c64d5e17-b659-47c6-aa5b-a62be849ee69\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.642583 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-ssh-key\") pod \"c64d5e17-b659-47c6-aa5b-a62be849ee69\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.642718 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-neutron-metadata-combined-ca-bundle\") pod \"c64d5e17-b659-47c6-aa5b-a62be849ee69\" (UID: \"c64d5e17-b659-47c6-aa5b-a62be849ee69\") " Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.662791 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c64d5e17-b659-47c6-aa5b-a62be849ee69-kube-api-access-crbfb" (OuterVolumeSpecName: "kube-api-access-crbfb") pod "c64d5e17-b659-47c6-aa5b-a62be849ee69" (UID: "c64d5e17-b659-47c6-aa5b-a62be849ee69"). InnerVolumeSpecName "kube-api-access-crbfb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.662994 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "c64d5e17-b659-47c6-aa5b-a62be849ee69" (UID: "c64d5e17-b659-47c6-aa5b-a62be849ee69"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.672347 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "c64d5e17-b659-47c6-aa5b-a62be849ee69" (UID: "c64d5e17-b659-47c6-aa5b-a62be849ee69"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.674686 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-inventory" (OuterVolumeSpecName: "inventory") pod "c64d5e17-b659-47c6-aa5b-a62be849ee69" (UID: "c64d5e17-b659-47c6-aa5b-a62be849ee69"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.675085 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "c64d5e17-b659-47c6-aa5b-a62be849ee69" (UID: "c64d5e17-b659-47c6-aa5b-a62be849ee69"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.697252 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c64d5e17-b659-47c6-aa5b-a62be849ee69" (UID: "c64d5e17-b659-47c6-aa5b-a62be849ee69"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.746299 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.746343 4691 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.746358 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crbfb\" (UniqueName: \"kubernetes.io/projected/c64d5e17-b659-47c6-aa5b-a62be849ee69-kube-api-access-crbfb\") on node \"crc\" DevicePath \"\"" Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.746372 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.746383 4691 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:21:18 crc kubenswrapper[4691]: I1202 08:21:18.746416 4691 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/c64d5e17-b659-47c6-aa5b-a62be849ee69-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.139379 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" event={"ID":"c64d5e17-b659-47c6-aa5b-a62be849ee69","Type":"ContainerDied","Data":"110a6cd5d57c74887ac9d452fb2c956843caf172cde73308b35f91adad0a6c92"} Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.139427 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="110a6cd5d57c74887ac9d452fb2c956843caf172cde73308b35f91adad0a6c92" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.139456 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.227570 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb"] Dec 02 08:21:19 crc kubenswrapper[4691]: E1202 08:21:19.227969 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64d5e17-b659-47c6-aa5b-a62be849ee69" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.227989 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64d5e17-b659-47c6-aa5b-a62be849ee69" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.228193 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c64d5e17-b659-47c6-aa5b-a62be849ee69" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.228901 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.230934 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.231471 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.232480 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.232686 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.238830 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.255824 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.255985 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.256146 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.256476 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xdvk\" (UniqueName: \"kubernetes.io/projected/d1c6d92a-1daf-4554-822b-1c946124e1d0-kube-api-access-5xdvk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.256677 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.282695 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb"] Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.359000 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5xdvk\" (UniqueName: \"kubernetes.io/projected/d1c6d92a-1daf-4554-822b-1c946124e1d0-kube-api-access-5xdvk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.359123 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.359187 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.359231 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.359291 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.364679 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.365027 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.366595 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.370588 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-ssh-key\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.390407 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xdvk\" (UniqueName: \"kubernetes.io/projected/d1c6d92a-1daf-4554-822b-1c946124e1d0-kube-api-access-5xdvk\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:19 crc kubenswrapper[4691]: I1202 08:21:19.548220 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" Dec 02 08:21:20 crc kubenswrapper[4691]: I1202 08:21:20.089116 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb"] Dec 02 08:21:20 crc kubenswrapper[4691]: I1202 08:21:20.148641 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" event={"ID":"d1c6d92a-1daf-4554-822b-1c946124e1d0","Type":"ContainerStarted","Data":"a3379298b886bde9919e48dcf74ab2fc825577327036581fccf7f1858f884829"} Dec 02 08:21:21 crc kubenswrapper[4691]: I1202 08:21:21.158697 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" event={"ID":"d1c6d92a-1daf-4554-822b-1c946124e1d0","Type":"ContainerStarted","Data":"2aaaa423d4bd9058eecc47a0f70888c0027048d64184d69727c30775025bb6d4"} Dec 02 08:21:21 crc kubenswrapper[4691]: I1202 08:21:21.180900 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" podStartSLOduration=1.666250433 podStartE2EDuration="2.180881359s" podCreationTimestamp="2025-12-02 08:21:19 +0000 UTC" firstStartedPulling="2025-12-02 08:21:20.093424194 +0000 UTC m=+2127.877503056" lastFinishedPulling="2025-12-02 08:21:20.60805512 +0000 UTC m=+2128.392133982" observedRunningTime="2025-12-02 08:21:21.174088683 +0000 UTC m=+2128.958167535" watchObservedRunningTime="2025-12-02 08:21:21.180881359 +0000 UTC m=+2128.964960221" Dec 02 08:21:21 crc kubenswrapper[4691]: I1202 08:21:21.898637 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:21:21 crc kubenswrapper[4691]: I1202 08:21:21.898699 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.073384 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bhc22"] Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.076646 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.104212 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bhc22"] Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.221234 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e641ac88-0669-4ace-af9d-252fcd0022ec-catalog-content\") pod \"redhat-operators-bhc22\" (UID: \"e641ac88-0669-4ace-af9d-252fcd0022ec\") " pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.221375 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e641ac88-0669-4ace-af9d-252fcd0022ec-utilities\") pod \"redhat-operators-bhc22\" (UID: \"e641ac88-0669-4ace-af9d-252fcd0022ec\") " pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.221437 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9m9w\" (UniqueName: \"kubernetes.io/projected/e641ac88-0669-4ace-af9d-252fcd0022ec-kube-api-access-b9m9w\") pod \"redhat-operators-bhc22\" (UID: \"e641ac88-0669-4ace-af9d-252fcd0022ec\") " pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.323471 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e641ac88-0669-4ace-af9d-252fcd0022ec-utilities\") pod \"redhat-operators-bhc22\" (UID: \"e641ac88-0669-4ace-af9d-252fcd0022ec\") " pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.323559 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9m9w\" (UniqueName: \"kubernetes.io/projected/e641ac88-0669-4ace-af9d-252fcd0022ec-kube-api-access-b9m9w\") pod \"redhat-operators-bhc22\" (UID: \"e641ac88-0669-4ace-af9d-252fcd0022ec\") " pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.323750 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e641ac88-0669-4ace-af9d-252fcd0022ec-catalog-content\") pod \"redhat-operators-bhc22\" (UID: \"e641ac88-0669-4ace-af9d-252fcd0022ec\") " pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.324144 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e641ac88-0669-4ace-af9d-252fcd0022ec-utilities\") pod \"redhat-operators-bhc22\" (UID: \"e641ac88-0669-4ace-af9d-252fcd0022ec\") " pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.324385 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e641ac88-0669-4ace-af9d-252fcd0022ec-catalog-content\") pod \"redhat-operators-bhc22\" (UID: \"e641ac88-0669-4ace-af9d-252fcd0022ec\") " pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.350783 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-b9m9w\" (UniqueName: \"kubernetes.io/projected/e641ac88-0669-4ace-af9d-252fcd0022ec-kube-api-access-b9m9w\") pod \"redhat-operators-bhc22\" (UID: \"e641ac88-0669-4ace-af9d-252fcd0022ec\") " pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.408148 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:45 crc kubenswrapper[4691]: I1202 08:21:45.859737 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bhc22"] Dec 02 08:21:46 crc kubenswrapper[4691]: I1202 08:21:46.416924 4691 generic.go:334] "Generic (PLEG): container finished" podID="e641ac88-0669-4ace-af9d-252fcd0022ec" containerID="3cac100deb98cd80e07e969cd313a67eb1f816eb1b29d5ea9c8b31d78567aa13" exitCode=0 Dec 02 08:21:46 crc kubenswrapper[4691]: I1202 08:21:46.417021 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhc22" event={"ID":"e641ac88-0669-4ace-af9d-252fcd0022ec","Type":"ContainerDied","Data":"3cac100deb98cd80e07e969cd313a67eb1f816eb1b29d5ea9c8b31d78567aa13"} Dec 02 08:21:46 crc kubenswrapper[4691]: I1202 08:21:46.417073 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhc22" event={"ID":"e641ac88-0669-4ace-af9d-252fcd0022ec","Type":"ContainerStarted","Data":"4e0f69f0d749ab6af9ce78818b8a8054311ef0f35bf6d9326536958bc8528d33"} Dec 02 08:21:46 crc kubenswrapper[4691]: I1202 08:21:46.419496 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:21:48 crc kubenswrapper[4691]: I1202 08:21:48.439930 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhc22" event={"ID":"e641ac88-0669-4ace-af9d-252fcd0022ec","Type":"ContainerStarted","Data":"c3f27cf7cad03ef58593980308f12fdd95d0687c88abeb2ce6b3d5baa119edc1"} Dec 02 08:21:50 crc kubenswrapper[4691]: I1202 08:21:50.459494 4691 generic.go:334] "Generic (PLEG): container finished" podID="e641ac88-0669-4ace-af9d-252fcd0022ec" containerID="c3f27cf7cad03ef58593980308f12fdd95d0687c88abeb2ce6b3d5baa119edc1" exitCode=0 Dec 02 08:21:50 crc kubenswrapper[4691]: I1202 08:21:50.459545 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhc22" event={"ID":"e641ac88-0669-4ace-af9d-252fcd0022ec","Type":"ContainerDied","Data":"c3f27cf7cad03ef58593980308f12fdd95d0687c88abeb2ce6b3d5baa119edc1"} Dec 02 08:21:51 crc kubenswrapper[4691]: I1202 08:21:51.898812 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:21:51 crc kubenswrapper[4691]: I1202 08:21:51.899285 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:21:52 crc kubenswrapper[4691]: I1202 08:21:52.479074 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhc22" 
event={"ID":"e641ac88-0669-4ace-af9d-252fcd0022ec","Type":"ContainerStarted","Data":"01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898"} Dec 02 08:21:52 crc kubenswrapper[4691]: I1202 08:21:52.505713 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bhc22" podStartSLOduration=2.588218072 podStartE2EDuration="7.505690267s" podCreationTimestamp="2025-12-02 08:21:45 +0000 UTC" firstStartedPulling="2025-12-02 08:21:46.419053145 +0000 UTC m=+2154.203132007" lastFinishedPulling="2025-12-02 08:21:51.33652534 +0000 UTC m=+2159.120604202" observedRunningTime="2025-12-02 08:21:52.501673446 +0000 UTC m=+2160.285752318" watchObservedRunningTime="2025-12-02 08:21:52.505690267 +0000 UTC m=+2160.289769129" Dec 02 08:21:55 crc kubenswrapper[4691]: I1202 08:21:55.409160 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:55 crc kubenswrapper[4691]: I1202 08:21:55.409702 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:21:56 crc kubenswrapper[4691]: I1202 08:21:56.458694 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bhc22" podUID="e641ac88-0669-4ace-af9d-252fcd0022ec" containerName="registry-server" probeResult="failure" output=< Dec 02 08:21:56 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Dec 02 08:21:56 crc kubenswrapper[4691]: > Dec 02 08:22:05 crc kubenswrapper[4691]: I1202 08:22:05.458688 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:22:05 crc kubenswrapper[4691]: I1202 08:22:05.512178 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:22:05 crc kubenswrapper[4691]: I1202 08:22:05.699991 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bhc22"] Dec 02 08:22:06 crc kubenswrapper[4691]: I1202 08:22:06.620973 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bhc22" podUID="e641ac88-0669-4ace-af9d-252fcd0022ec" containerName="registry-server" containerID="cri-o://01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898" gracePeriod=2 Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.087721 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.167878 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e641ac88-0669-4ace-af9d-252fcd0022ec-catalog-content\") pod \"e641ac88-0669-4ace-af9d-252fcd0022ec\" (UID: \"e641ac88-0669-4ace-af9d-252fcd0022ec\") " Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.167968 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9m9w\" (UniqueName: \"kubernetes.io/projected/e641ac88-0669-4ace-af9d-252fcd0022ec-kube-api-access-b9m9w\") pod \"e641ac88-0669-4ace-af9d-252fcd0022ec\" (UID: \"e641ac88-0669-4ace-af9d-252fcd0022ec\") " Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.168190 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e641ac88-0669-4ace-af9d-252fcd0022ec-utilities\") pod \"e641ac88-0669-4ace-af9d-252fcd0022ec\" (UID: \"e641ac88-0669-4ace-af9d-252fcd0022ec\") " Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.169146 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e641ac88-0669-4ace-af9d-252fcd0022ec-utilities" (OuterVolumeSpecName: "utilities") pod "e641ac88-0669-4ace-af9d-252fcd0022ec" (UID: "e641ac88-0669-4ace-af9d-252fcd0022ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.174503 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e641ac88-0669-4ace-af9d-252fcd0022ec-kube-api-access-b9m9w" (OuterVolumeSpecName: "kube-api-access-b9m9w") pod "e641ac88-0669-4ace-af9d-252fcd0022ec" (UID: "e641ac88-0669-4ace-af9d-252fcd0022ec"). InnerVolumeSpecName "kube-api-access-b9m9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.271320 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9m9w\" (UniqueName: \"kubernetes.io/projected/e641ac88-0669-4ace-af9d-252fcd0022ec-kube-api-access-b9m9w\") on node \"crc\" DevicePath \"\"" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.271360 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e641ac88-0669-4ace-af9d-252fcd0022ec-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.281514 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e641ac88-0669-4ace-af9d-252fcd0022ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e641ac88-0669-4ace-af9d-252fcd0022ec" (UID: "e641ac88-0669-4ace-af9d-252fcd0022ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.373314 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e641ac88-0669-4ace-af9d-252fcd0022ec-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.631603 4691 generic.go:334] "Generic (PLEG): container finished" podID="e641ac88-0669-4ace-af9d-252fcd0022ec" containerID="01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898" exitCode=0 Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.631708 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bhc22" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.632635 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhc22" event={"ID":"e641ac88-0669-4ace-af9d-252fcd0022ec","Type":"ContainerDied","Data":"01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898"} Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.632815 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bhc22" event={"ID":"e641ac88-0669-4ace-af9d-252fcd0022ec","Type":"ContainerDied","Data":"4e0f69f0d749ab6af9ce78818b8a8054311ef0f35bf6d9326536958bc8528d33"} Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.632909 4691 scope.go:117] "RemoveContainer" containerID="01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.653624 4691 scope.go:117] "RemoveContainer" containerID="c3f27cf7cad03ef58593980308f12fdd95d0687c88abeb2ce6b3d5baa119edc1" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.669184 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bhc22"] Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.678583 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bhc22"] Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.701475 4691 scope.go:117] "RemoveContainer" containerID="3cac100deb98cd80e07e969cd313a67eb1f816eb1b29d5ea9c8b31d78567aa13" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.728080 4691 scope.go:117] "RemoveContainer" containerID="01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898" Dec 02 08:22:07 crc kubenswrapper[4691]: E1202 08:22:07.728671 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898\": container with ID starting with 01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898 not found: ID does not exist" containerID="01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.728702 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898"} err="failed to get container status \"01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898\": rpc error: code = NotFound desc = could not find container \"01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898\": container with ID starting with 01315e5a3d6ddaf127a884eaecc61e2507b678478bc8438d3d25748e07ce1898 not found: ID does not exist" Dec 02 08:22:07 crc 
kubenswrapper[4691]: I1202 08:22:07.728723 4691 scope.go:117] "RemoveContainer" containerID="c3f27cf7cad03ef58593980308f12fdd95d0687c88abeb2ce6b3d5baa119edc1" Dec 02 08:22:07 crc kubenswrapper[4691]: E1202 08:22:07.729133 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f27cf7cad03ef58593980308f12fdd95d0687c88abeb2ce6b3d5baa119edc1\": container with ID starting with c3f27cf7cad03ef58593980308f12fdd95d0687c88abeb2ce6b3d5baa119edc1 not found: ID does not exist" containerID="c3f27cf7cad03ef58593980308f12fdd95d0687c88abeb2ce6b3d5baa119edc1" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.729174 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f27cf7cad03ef58593980308f12fdd95d0687c88abeb2ce6b3d5baa119edc1"} err="failed to get container status \"c3f27cf7cad03ef58593980308f12fdd95d0687c88abeb2ce6b3d5baa119edc1\": rpc error: code = NotFound desc = could not find container \"c3f27cf7cad03ef58593980308f12fdd95d0687c88abeb2ce6b3d5baa119edc1\": container with ID starting with c3f27cf7cad03ef58593980308f12fdd95d0687c88abeb2ce6b3d5baa119edc1 not found: ID does not exist" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.729209 4691 scope.go:117] "RemoveContainer" containerID="3cac100deb98cd80e07e969cd313a67eb1f816eb1b29d5ea9c8b31d78567aa13" Dec 02 08:22:07 crc kubenswrapper[4691]: E1202 08:22:07.731686 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cac100deb98cd80e07e969cd313a67eb1f816eb1b29d5ea9c8b31d78567aa13\": container with ID starting with 3cac100deb98cd80e07e969cd313a67eb1f816eb1b29d5ea9c8b31d78567aa13 not found: ID does not exist" containerID="3cac100deb98cd80e07e969cd313a67eb1f816eb1b29d5ea9c8b31d78567aa13" Dec 02 08:22:07 crc kubenswrapper[4691]: I1202 08:22:07.731725 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cac100deb98cd80e07e969cd313a67eb1f816eb1b29d5ea9c8b31d78567aa13"} err="failed to get container status \"3cac100deb98cd80e07e969cd313a67eb1f816eb1b29d5ea9c8b31d78567aa13\": rpc error: code = NotFound desc = could not find container \"3cac100deb98cd80e07e969cd313a67eb1f816eb1b29d5ea9c8b31d78567aa13\": container with ID starting with 3cac100deb98cd80e07e969cd313a67eb1f816eb1b29d5ea9c8b31d78567aa13 not found: ID does not exist" Dec 02 08:22:08 crc kubenswrapper[4691]: I1202 08:22:08.573078 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e641ac88-0669-4ace-af9d-252fcd0022ec" path="/var/lib/kubelet/pods/e641ac88-0669-4ace-af9d-252fcd0022ec/volumes" Dec 02 08:22:21 crc kubenswrapper[4691]: I1202 08:22:21.898707 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:22:21 crc kubenswrapper[4691]: I1202 08:22:21.899733 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:22:21 crc kubenswrapper[4691]: I1202 08:22:21.899826 4691 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 08:22:21 crc kubenswrapper[4691]: I1202 08:22:21.900939 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"76431cd8b2e3d8ce369a661c072e2a35d2afb457a7614f3fda9fe111278924cc"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:22:21 crc kubenswrapper[4691]: I1202 08:22:21.901031 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://76431cd8b2e3d8ce369a661c072e2a35d2afb457a7614f3fda9fe111278924cc" gracePeriod=600 Dec 02 08:22:22 crc kubenswrapper[4691]: I1202 08:22:22.778998 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="76431cd8b2e3d8ce369a661c072e2a35d2afb457a7614f3fda9fe111278924cc" exitCode=0 Dec 02 08:22:22 crc kubenswrapper[4691]: I1202 08:22:22.779059 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"76431cd8b2e3d8ce369a661c072e2a35d2afb457a7614f3fda9fe111278924cc"} Dec 02 08:22:22 crc kubenswrapper[4691]: I1202 08:22:22.779540 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"} Dec 02 08:22:22 crc kubenswrapper[4691]: I1202 08:22:22.779560 4691 scope.go:117] "RemoveContainer" containerID="33421f48e3190497c2d35d3dc45d26581dd9a2eb0a75e94f7ce101f34a62d5a4" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.491990 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dfb4m"] Dec 02 08:22:54 crc kubenswrapper[4691]: E1202 08:22:54.493031 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e641ac88-0669-4ace-af9d-252fcd0022ec" containerName="extract-utilities" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.493046 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e641ac88-0669-4ace-af9d-252fcd0022ec" containerName="extract-utilities" Dec 02 08:22:54 crc kubenswrapper[4691]: E1202 08:22:54.493074 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e641ac88-0669-4ace-af9d-252fcd0022ec" containerName="extract-content" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.493081 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e641ac88-0669-4ace-af9d-252fcd0022ec" containerName="extract-content" Dec 02 08:22:54 crc kubenswrapper[4691]: E1202 08:22:54.493104 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e641ac88-0669-4ace-af9d-252fcd0022ec" containerName="registry-server" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.493128 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e641ac88-0669-4ace-af9d-252fcd0022ec" containerName="registry-server" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.493378 4691 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e641ac88-0669-4ace-af9d-252fcd0022ec" containerName="registry-server" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.495199 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.502878 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dfb4m"] Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.668042 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b9f587-2b06-4517-9181-29cc93ddcac6-catalog-content\") pod \"certified-operators-dfb4m\" (UID: \"a1b9f587-2b06-4517-9181-29cc93ddcac6\") " pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.668149 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n28f6\" (UniqueName: \"kubernetes.io/projected/a1b9f587-2b06-4517-9181-29cc93ddcac6-kube-api-access-n28f6\") pod \"certified-operators-dfb4m\" (UID: \"a1b9f587-2b06-4517-9181-29cc93ddcac6\") " pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.668227 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b9f587-2b06-4517-9181-29cc93ddcac6-utilities\") pod \"certified-operators-dfb4m\" (UID: \"a1b9f587-2b06-4517-9181-29cc93ddcac6\") " pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.770343 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n28f6\" (UniqueName: \"kubernetes.io/projected/a1b9f587-2b06-4517-9181-29cc93ddcac6-kube-api-access-n28f6\") pod \"certified-operators-dfb4m\" (UID: \"a1b9f587-2b06-4517-9181-29cc93ddcac6\") " pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.770517 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b9f587-2b06-4517-9181-29cc93ddcac6-utilities\") pod \"certified-operators-dfb4m\" (UID: \"a1b9f587-2b06-4517-9181-29cc93ddcac6\") " pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.770672 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b9f587-2b06-4517-9181-29cc93ddcac6-catalog-content\") pod \"certified-operators-dfb4m\" (UID: \"a1b9f587-2b06-4517-9181-29cc93ddcac6\") " pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.771274 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b9f587-2b06-4517-9181-29cc93ddcac6-utilities\") pod \"certified-operators-dfb4m\" (UID: \"a1b9f587-2b06-4517-9181-29cc93ddcac6\") " pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.771422 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b9f587-2b06-4517-9181-29cc93ddcac6-catalog-content\") pod \"certified-operators-dfb4m\" (UID: 
\"a1b9f587-2b06-4517-9181-29cc93ddcac6\") " pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.794514 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n28f6\" (UniqueName: \"kubernetes.io/projected/a1b9f587-2b06-4517-9181-29cc93ddcac6-kube-api-access-n28f6\") pod \"certified-operators-dfb4m\" (UID: \"a1b9f587-2b06-4517-9181-29cc93ddcac6\") " pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:22:54 crc kubenswrapper[4691]: I1202 08:22:54.822137 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.067071 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jxshw"] Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.080095 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.117619 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxshw"] Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.216953 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x89rx\" (UniqueName: \"kubernetes.io/projected/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-kube-api-access-x89rx\") pod \"redhat-marketplace-jxshw\" (UID: \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\") " pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.217059 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-catalog-content\") pod \"redhat-marketplace-jxshw\" (UID: \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\") " pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.217089 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-utilities\") pod \"redhat-marketplace-jxshw\" (UID: \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\") " pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.319609 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-catalog-content\") pod \"redhat-marketplace-jxshw\" (UID: \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\") " pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.319685 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-utilities\") pod \"redhat-marketplace-jxshw\" (UID: \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\") " pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.319868 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x89rx\" (UniqueName: \"kubernetes.io/projected/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-kube-api-access-x89rx\") pod \"redhat-marketplace-jxshw\" (UID: 
\"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\") " pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.320482 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-catalog-content\") pod \"redhat-marketplace-jxshw\" (UID: \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\") " pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.320553 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-utilities\") pod \"redhat-marketplace-jxshw\" (UID: \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\") " pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.344927 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x89rx\" (UniqueName: \"kubernetes.io/projected/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-kube-api-access-x89rx\") pod \"redhat-marketplace-jxshw\" (UID: \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\") " pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.403241 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dfb4m"] Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.425931 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:22:55 crc kubenswrapper[4691]: I1202 08:22:55.921819 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxshw"] Dec 02 08:22:56 crc kubenswrapper[4691]: I1202 08:22:56.115253 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshw" event={"ID":"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8","Type":"ContainerStarted","Data":"042f95bbb90db43342d447efe2ab0626cfda2035dbd0a693058b102eacef6141"} Dec 02 08:22:56 crc kubenswrapper[4691]: I1202 08:22:56.117417 4691 generic.go:334] "Generic (PLEG): container finished" podID="a1b9f587-2b06-4517-9181-29cc93ddcac6" containerID="145c19d6417e3a859de67aab972218aea2481b9e978fc3d7f9ddf4b4835a93c2" exitCode=0 Dec 02 08:22:56 crc kubenswrapper[4691]: I1202 08:22:56.117480 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfb4m" event={"ID":"a1b9f587-2b06-4517-9181-29cc93ddcac6","Type":"ContainerDied","Data":"145c19d6417e3a859de67aab972218aea2481b9e978fc3d7f9ddf4b4835a93c2"} Dec 02 08:22:56 crc kubenswrapper[4691]: I1202 08:22:56.117558 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfb4m" event={"ID":"a1b9f587-2b06-4517-9181-29cc93ddcac6","Type":"ContainerStarted","Data":"bc1a9484686a48a4212b2aebf33c1d86db7e96595f726ba23c12153498723b2d"} Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.129440 4691 generic.go:334] "Generic (PLEG): container finished" podID="5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" containerID="0247d5f08ad9bdf565082c7ceef71b57b83e15dbe3872032e1b260af065a5962" exitCode=0 Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.129682 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshw" 
event={"ID":"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8","Type":"ContainerDied","Data":"0247d5f08ad9bdf565082c7ceef71b57b83e15dbe3872032e1b260af065a5962"} Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.136085 4691 generic.go:334] "Generic (PLEG): container finished" podID="a1b9f587-2b06-4517-9181-29cc93ddcac6" containerID="41445823e5f699daf544f58782b05c5006b61ca5d080dc0e6c8bc78962f88f96" exitCode=0 Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.136153 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfb4m" event={"ID":"a1b9f587-2b06-4517-9181-29cc93ddcac6","Type":"ContainerDied","Data":"41445823e5f699daf544f58782b05c5006b61ca5d080dc0e6c8bc78962f88f96"} Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.462297 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n4lg6"] Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.484839 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.515315 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n4lg6"] Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.587211 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f47be358-b4c8-44d2-a9db-d8c7fb428f49-utilities\") pod \"community-operators-n4lg6\" (UID: \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\") " pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.587605 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f47be358-b4c8-44d2-a9db-d8c7fb428f49-catalog-content\") pod \"community-operators-n4lg6\" (UID: \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\") " pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.587656 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spdgl\" (UniqueName: \"kubernetes.io/projected/f47be358-b4c8-44d2-a9db-d8c7fb428f49-kube-api-access-spdgl\") pod \"community-operators-n4lg6\" (UID: \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\") " pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.689794 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f47be358-b4c8-44d2-a9db-d8c7fb428f49-utilities\") pod \"community-operators-n4lg6\" (UID: \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\") " pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.689924 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f47be358-b4c8-44d2-a9db-d8c7fb428f49-catalog-content\") pod \"community-operators-n4lg6\" (UID: \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\") " pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.689975 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spdgl\" (UniqueName: 
\"kubernetes.io/projected/f47be358-b4c8-44d2-a9db-d8c7fb428f49-kube-api-access-spdgl\") pod \"community-operators-n4lg6\" (UID: \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\") " pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.690266 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f47be358-b4c8-44d2-a9db-d8c7fb428f49-utilities\") pod \"community-operators-n4lg6\" (UID: \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\") " pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.690784 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f47be358-b4c8-44d2-a9db-d8c7fb428f49-catalog-content\") pod \"community-operators-n4lg6\" (UID: \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\") " pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.712264 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spdgl\" (UniqueName: \"kubernetes.io/projected/f47be358-b4c8-44d2-a9db-d8c7fb428f49-kube-api-access-spdgl\") pod \"community-operators-n4lg6\" (UID: \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\") " pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:22:57 crc kubenswrapper[4691]: I1202 08:22:57.818153 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:22:58 crc kubenswrapper[4691]: I1202 08:22:58.162983 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshw" event={"ID":"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8","Type":"ContainerStarted","Data":"c78f20f029b6dfa782ffdf483aadf108d0b2932b8e3b835c0fbf106692579921"} Dec 02 08:22:58 crc kubenswrapper[4691]: I1202 08:22:58.169902 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfb4m" event={"ID":"a1b9f587-2b06-4517-9181-29cc93ddcac6","Type":"ContainerStarted","Data":"8d989cfbdce2f1ee21960212b077fbbae1e769a2036fda73b95c83e88b234003"} Dec 02 08:22:58 crc kubenswrapper[4691]: I1202 08:22:58.227817 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n4lg6"] Dec 02 08:22:58 crc kubenswrapper[4691]: I1202 08:22:58.234259 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dfb4m" podStartSLOduration=2.6143667649999998 podStartE2EDuration="4.234238903s" podCreationTimestamp="2025-12-02 08:22:54 +0000 UTC" firstStartedPulling="2025-12-02 08:22:56.119092922 +0000 UTC m=+2223.903171784" lastFinishedPulling="2025-12-02 08:22:57.73896506 +0000 UTC m=+2225.523043922" observedRunningTime="2025-12-02 08:22:58.21649432 +0000 UTC m=+2226.000573202" watchObservedRunningTime="2025-12-02 08:22:58.234238903 +0000 UTC m=+2226.018317765" Dec 02 08:22:58 crc kubenswrapper[4691]: W1202 08:22:58.246084 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf47be358_b4c8_44d2_a9db_d8c7fb428f49.slice/crio-03a6c7a854959a7ab9668d07d1074743f6d37ff4d39ddb30a25c15177838d0f6 WatchSource:0}: Error finding container 03a6c7a854959a7ab9668d07d1074743f6d37ff4d39ddb30a25c15177838d0f6: Status 404 returned error can't find the container with id 
03a6c7a854959a7ab9668d07d1074743f6d37ff4d39ddb30a25c15177838d0f6 Dec 02 08:22:59 crc kubenswrapper[4691]: I1202 08:22:59.184438 4691 generic.go:334] "Generic (PLEG): container finished" podID="5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" containerID="c78f20f029b6dfa782ffdf483aadf108d0b2932b8e3b835c0fbf106692579921" exitCode=0 Dec 02 08:22:59 crc kubenswrapper[4691]: I1202 08:22:59.184550 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshw" event={"ID":"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8","Type":"ContainerDied","Data":"c78f20f029b6dfa782ffdf483aadf108d0b2932b8e3b835c0fbf106692579921"} Dec 02 08:22:59 crc kubenswrapper[4691]: I1202 08:22:59.188295 4691 generic.go:334] "Generic (PLEG): container finished" podID="f47be358-b4c8-44d2-a9db-d8c7fb428f49" containerID="22b631cb8047c42cc4029d15670d6612467f220cff944db53761b1d21f63ba2c" exitCode=0 Dec 02 08:22:59 crc kubenswrapper[4691]: I1202 08:22:59.188362 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4lg6" event={"ID":"f47be358-b4c8-44d2-a9db-d8c7fb428f49","Type":"ContainerDied","Data":"22b631cb8047c42cc4029d15670d6612467f220cff944db53761b1d21f63ba2c"} Dec 02 08:22:59 crc kubenswrapper[4691]: I1202 08:22:59.188399 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4lg6" event={"ID":"f47be358-b4c8-44d2-a9db-d8c7fb428f49","Type":"ContainerStarted","Data":"03a6c7a854959a7ab9668d07d1074743f6d37ff4d39ddb30a25c15177838d0f6"} Dec 02 08:23:00 crc kubenswrapper[4691]: I1202 08:23:00.199283 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshw" event={"ID":"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8","Type":"ContainerStarted","Data":"bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea"} Dec 02 08:23:00 crc kubenswrapper[4691]: I1202 08:23:00.202818 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4lg6" event={"ID":"f47be358-b4c8-44d2-a9db-d8c7fb428f49","Type":"ContainerStarted","Data":"c8f8a3da2846bb32d898afb8e2fb7da42c7fbdf6439e9409da83e1e93355f409"} Dec 02 08:23:00 crc kubenswrapper[4691]: I1202 08:23:00.223324 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jxshw" podStartSLOduration=2.683815233 podStartE2EDuration="5.223300136s" podCreationTimestamp="2025-12-02 08:22:55 +0000 UTC" firstStartedPulling="2025-12-02 08:22:57.132980102 +0000 UTC m=+2224.917058974" lastFinishedPulling="2025-12-02 08:22:59.672465015 +0000 UTC m=+2227.456543877" observedRunningTime="2025-12-02 08:23:00.216922467 +0000 UTC m=+2228.001001349" watchObservedRunningTime="2025-12-02 08:23:00.223300136 +0000 UTC m=+2228.007378998" Dec 02 08:23:01 crc kubenswrapper[4691]: I1202 08:23:01.212908 4691 generic.go:334] "Generic (PLEG): container finished" podID="f47be358-b4c8-44d2-a9db-d8c7fb428f49" containerID="c8f8a3da2846bb32d898afb8e2fb7da42c7fbdf6439e9409da83e1e93355f409" exitCode=0 Dec 02 08:23:01 crc kubenswrapper[4691]: I1202 08:23:01.213005 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4lg6" event={"ID":"f47be358-b4c8-44d2-a9db-d8c7fb428f49","Type":"ContainerDied","Data":"c8f8a3da2846bb32d898afb8e2fb7da42c7fbdf6439e9409da83e1e93355f409"} Dec 02 08:23:03 crc kubenswrapper[4691]: I1202 08:23:03.238433 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-n4lg6" event={"ID":"f47be358-b4c8-44d2-a9db-d8c7fb428f49","Type":"ContainerStarted","Data":"ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5"} Dec 02 08:23:03 crc kubenswrapper[4691]: I1202 08:23:03.265833 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n4lg6" podStartSLOduration=3.427972946 podStartE2EDuration="6.265809237s" podCreationTimestamp="2025-12-02 08:22:57 +0000 UTC" firstStartedPulling="2025-12-02 08:22:59.189733375 +0000 UTC m=+2226.973812237" lastFinishedPulling="2025-12-02 08:23:02.027569666 +0000 UTC m=+2229.811648528" observedRunningTime="2025-12-02 08:23:03.254739871 +0000 UTC m=+2231.038818753" watchObservedRunningTime="2025-12-02 08:23:03.265809237 +0000 UTC m=+2231.049888089" Dec 02 08:23:04 crc kubenswrapper[4691]: I1202 08:23:04.823176 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:23:04 crc kubenswrapper[4691]: I1202 08:23:04.823497 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:23:04 crc kubenswrapper[4691]: I1202 08:23:04.869394 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:23:05 crc kubenswrapper[4691]: I1202 08:23:05.301749 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:23:05 crc kubenswrapper[4691]: I1202 08:23:05.427003 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:23:05 crc kubenswrapper[4691]: I1202 08:23:05.428285 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:23:05 crc kubenswrapper[4691]: I1202 08:23:05.478103 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:23:06 crc kubenswrapper[4691]: I1202 08:23:06.319984 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:23:06 crc kubenswrapper[4691]: I1202 08:23:06.852548 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dfb4m"] Dec 02 08:23:07 crc kubenswrapper[4691]: I1202 08:23:07.274428 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dfb4m" podUID="a1b9f587-2b06-4517-9181-29cc93ddcac6" containerName="registry-server" containerID="cri-o://8d989cfbdce2f1ee21960212b077fbbae1e769a2036fda73b95c83e88b234003" gracePeriod=2 Dec 02 08:23:07 crc kubenswrapper[4691]: I1202 08:23:07.819286 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:23:07 crc kubenswrapper[4691]: I1202 08:23:07.821065 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:23:07 crc kubenswrapper[4691]: I1202 08:23:07.855219 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxshw"] Dec 02 08:23:07 crc kubenswrapper[4691]: I1202 08:23:07.870062 4691 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:23:08 crc kubenswrapper[4691]: I1202 08:23:08.287614 4691 generic.go:334] "Generic (PLEG): container finished" podID="a1b9f587-2b06-4517-9181-29cc93ddcac6" containerID="8d989cfbdce2f1ee21960212b077fbbae1e769a2036fda73b95c83e88b234003" exitCode=0 Dec 02 08:23:08 crc kubenswrapper[4691]: I1202 08:23:08.287727 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfb4m" event={"ID":"a1b9f587-2b06-4517-9181-29cc93ddcac6","Type":"ContainerDied","Data":"8d989cfbdce2f1ee21960212b077fbbae1e769a2036fda73b95c83e88b234003"} Dec 02 08:23:08 crc kubenswrapper[4691]: I1202 08:23:08.334047 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:23:08 crc kubenswrapper[4691]: I1202 08:23:08.850874 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:23:08 crc kubenswrapper[4691]: I1202 08:23:08.944930 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b9f587-2b06-4517-9181-29cc93ddcac6-catalog-content\") pod \"a1b9f587-2b06-4517-9181-29cc93ddcac6\" (UID: \"a1b9f587-2b06-4517-9181-29cc93ddcac6\") " Dec 02 08:23:08 crc kubenswrapper[4691]: I1202 08:23:08.945356 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n28f6\" (UniqueName: \"kubernetes.io/projected/a1b9f587-2b06-4517-9181-29cc93ddcac6-kube-api-access-n28f6\") pod \"a1b9f587-2b06-4517-9181-29cc93ddcac6\" (UID: \"a1b9f587-2b06-4517-9181-29cc93ddcac6\") " Dec 02 08:23:08 crc kubenswrapper[4691]: I1202 08:23:08.945454 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b9f587-2b06-4517-9181-29cc93ddcac6-utilities\") pod \"a1b9f587-2b06-4517-9181-29cc93ddcac6\" (UID: \"a1b9f587-2b06-4517-9181-29cc93ddcac6\") " Dec 02 08:23:08 crc kubenswrapper[4691]: I1202 08:23:08.946414 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1b9f587-2b06-4517-9181-29cc93ddcac6-utilities" (OuterVolumeSpecName: "utilities") pod "a1b9f587-2b06-4517-9181-29cc93ddcac6" (UID: "a1b9f587-2b06-4517-9181-29cc93ddcac6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:23:08 crc kubenswrapper[4691]: I1202 08:23:08.952030 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b9f587-2b06-4517-9181-29cc93ddcac6-kube-api-access-n28f6" (OuterVolumeSpecName: "kube-api-access-n28f6") pod "a1b9f587-2b06-4517-9181-29cc93ddcac6" (UID: "a1b9f587-2b06-4517-9181-29cc93ddcac6"). InnerVolumeSpecName "kube-api-access-n28f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:23:08 crc kubenswrapper[4691]: I1202 08:23:08.996535 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1b9f587-2b06-4517-9181-29cc93ddcac6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a1b9f587-2b06-4517-9181-29cc93ddcac6" (UID: "a1b9f587-2b06-4517-9181-29cc93ddcac6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.047905 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a1b9f587-2b06-4517-9181-29cc93ddcac6-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.047937 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a1b9f587-2b06-4517-9181-29cc93ddcac6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.047951 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n28f6\" (UniqueName: \"kubernetes.io/projected/a1b9f587-2b06-4517-9181-29cc93ddcac6-kube-api-access-n28f6\") on node \"crc\" DevicePath \"\"" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.299559 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dfb4m" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.299555 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dfb4m" event={"ID":"a1b9f587-2b06-4517-9181-29cc93ddcac6","Type":"ContainerDied","Data":"bc1a9484686a48a4212b2aebf33c1d86db7e96595f726ba23c12153498723b2d"} Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.299648 4691 scope.go:117] "RemoveContainer" containerID="8d989cfbdce2f1ee21960212b077fbbae1e769a2036fda73b95c83e88b234003" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.300144 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jxshw" podUID="5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" containerName="registry-server" containerID="cri-o://bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea" gracePeriod=2 Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.334591 4691 scope.go:117] "RemoveContainer" containerID="41445823e5f699daf544f58782b05c5006b61ca5d080dc0e6c8bc78962f88f96" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.340661 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dfb4m"] Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.350839 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dfb4m"] Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.363743 4691 scope.go:117] "RemoveContainer" containerID="145c19d6417e3a859de67aab972218aea2481b9e978fc3d7f9ddf4b4835a93c2" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.683709 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.761313 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-catalog-content\") pod \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\" (UID: \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\") " Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.761497 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-utilities\") pod \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\" (UID: \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\") " Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.761532 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x89rx\" (UniqueName: \"kubernetes.io/projected/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-kube-api-access-x89rx\") pod \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\" (UID: \"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8\") " Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.762595 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-utilities" (OuterVolumeSpecName: "utilities") pod "5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" (UID: "5c31ac24-b764-4e8b-bfd8-2b25ab60dee8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.767534 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-kube-api-access-x89rx" (OuterVolumeSpecName: "kube-api-access-x89rx") pod "5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" (UID: "5c31ac24-b764-4e8b-bfd8-2b25ab60dee8"). InnerVolumeSpecName "kube-api-access-x89rx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.782946 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" (UID: "5c31ac24-b764-4e8b-bfd8-2b25ab60dee8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.863987 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.864031 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x89rx\" (UniqueName: \"kubernetes.io/projected/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-kube-api-access-x89rx\") on node \"crc\" DevicePath \"\"" Dec 02 08:23:09 crc kubenswrapper[4691]: I1202 08:23:09.864044 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.311211 4691 generic.go:334] "Generic (PLEG): container finished" podID="5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" containerID="bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea" exitCode=0 Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.311599 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshw" event={"ID":"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8","Type":"ContainerDied","Data":"bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea"} Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.311661 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxshw" event={"ID":"5c31ac24-b764-4e8b-bfd8-2b25ab60dee8","Type":"ContainerDied","Data":"042f95bbb90db43342d447efe2ab0626cfda2035dbd0a693058b102eacef6141"} Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.311681 4691 scope.go:117] "RemoveContainer" containerID="bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea" Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.311752 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxshw" Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.349951 4691 scope.go:117] "RemoveContainer" containerID="c78f20f029b6dfa782ffdf483aadf108d0b2932b8e3b835c0fbf106692579921" Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.364601 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxshw"] Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.380475 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxshw"] Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.388180 4691 scope.go:117] "RemoveContainer" containerID="0247d5f08ad9bdf565082c7ceef71b57b83e15dbe3872032e1b260af065a5962" Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.419143 4691 scope.go:117] "RemoveContainer" containerID="bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea" Dec 02 08:23:10 crc kubenswrapper[4691]: E1202 08:23:10.419747 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea\": container with ID starting with bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea not found: ID does not exist" containerID="bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea" Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.419871 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea"} err="failed to get container status \"bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea\": rpc error: code = NotFound desc = could not find container \"bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea\": container with ID starting with bac1295b1a2aa9e49422a0255c1bdc776860a183600bdb9d63c4f861f33529ea not found: ID does not exist" Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.419900 4691 scope.go:117] "RemoveContainer" containerID="c78f20f029b6dfa782ffdf483aadf108d0b2932b8e3b835c0fbf106692579921" Dec 02 08:23:10 crc kubenswrapper[4691]: E1202 08:23:10.420333 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c78f20f029b6dfa782ffdf483aadf108d0b2932b8e3b835c0fbf106692579921\": container with ID starting with c78f20f029b6dfa782ffdf483aadf108d0b2932b8e3b835c0fbf106692579921 not found: ID does not exist" containerID="c78f20f029b6dfa782ffdf483aadf108d0b2932b8e3b835c0fbf106692579921" Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.420358 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c78f20f029b6dfa782ffdf483aadf108d0b2932b8e3b835c0fbf106692579921"} err="failed to get container status \"c78f20f029b6dfa782ffdf483aadf108d0b2932b8e3b835c0fbf106692579921\": rpc error: code = NotFound desc = could not find container \"c78f20f029b6dfa782ffdf483aadf108d0b2932b8e3b835c0fbf106692579921\": container with ID starting with c78f20f029b6dfa782ffdf483aadf108d0b2932b8e3b835c0fbf106692579921 not found: ID does not exist" Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.420393 4691 scope.go:117] "RemoveContainer" containerID="0247d5f08ad9bdf565082c7ceef71b57b83e15dbe3872032e1b260af065a5962" Dec 02 08:23:10 crc kubenswrapper[4691]: E1202 08:23:10.420743 4691 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"0247d5f08ad9bdf565082c7ceef71b57b83e15dbe3872032e1b260af065a5962\": container with ID starting with 0247d5f08ad9bdf565082c7ceef71b57b83e15dbe3872032e1b260af065a5962 not found: ID does not exist" containerID="0247d5f08ad9bdf565082c7ceef71b57b83e15dbe3872032e1b260af065a5962" Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.420811 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0247d5f08ad9bdf565082c7ceef71b57b83e15dbe3872032e1b260af065a5962"} err="failed to get container status \"0247d5f08ad9bdf565082c7ceef71b57b83e15dbe3872032e1b260af065a5962\": rpc error: code = NotFound desc = could not find container \"0247d5f08ad9bdf565082c7ceef71b57b83e15dbe3872032e1b260af065a5962\": container with ID starting with 0247d5f08ad9bdf565082c7ceef71b57b83e15dbe3872032e1b260af065a5962 not found: ID does not exist" Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.583209 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" path="/var/lib/kubelet/pods/5c31ac24-b764-4e8b-bfd8-2b25ab60dee8/volumes" Dec 02 08:23:10 crc kubenswrapper[4691]: I1202 08:23:10.583885 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b9f587-2b06-4517-9181-29cc93ddcac6" path="/var/lib/kubelet/pods/a1b9f587-2b06-4517-9181-29cc93ddcac6/volumes" Dec 02 08:23:11 crc kubenswrapper[4691]: I1202 08:23:11.653962 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n4lg6"] Dec 02 08:23:11 crc kubenswrapper[4691]: I1202 08:23:11.655745 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n4lg6" podUID="f47be358-b4c8-44d2-a9db-d8c7fb428f49" containerName="registry-server" containerID="cri-o://ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5" gracePeriod=2 Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.120749 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4lg6" Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.274497 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f47be358-b4c8-44d2-a9db-d8c7fb428f49-utilities\") pod \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\" (UID: \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\") " Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.274676 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spdgl\" (UniqueName: \"kubernetes.io/projected/f47be358-b4c8-44d2-a9db-d8c7fb428f49-kube-api-access-spdgl\") pod \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\" (UID: \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\") " Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.274732 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f47be358-b4c8-44d2-a9db-d8c7fb428f49-catalog-content\") pod \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\" (UID: \"f47be358-b4c8-44d2-a9db-d8c7fb428f49\") " Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.275898 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f47be358-b4c8-44d2-a9db-d8c7fb428f49-utilities" (OuterVolumeSpecName: "utilities") pod "f47be358-b4c8-44d2-a9db-d8c7fb428f49" (UID: "f47be358-b4c8-44d2-a9db-d8c7fb428f49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.280196 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f47be358-b4c8-44d2-a9db-d8c7fb428f49-kube-api-access-spdgl" (OuterVolumeSpecName: "kube-api-access-spdgl") pod "f47be358-b4c8-44d2-a9db-d8c7fb428f49" (UID: "f47be358-b4c8-44d2-a9db-d8c7fb428f49"). InnerVolumeSpecName "kube-api-access-spdgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.330777 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f47be358-b4c8-44d2-a9db-d8c7fb428f49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f47be358-b4c8-44d2-a9db-d8c7fb428f49" (UID: "f47be358-b4c8-44d2-a9db-d8c7fb428f49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.337290 4691 generic.go:334] "Generic (PLEG): container finished" podID="f47be358-b4c8-44d2-a9db-d8c7fb428f49" containerID="ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5" exitCode=0 Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.337331 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4lg6" event={"ID":"f47be358-b4c8-44d2-a9db-d8c7fb428f49","Type":"ContainerDied","Data":"ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5"} Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.337364 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n4lg6" event={"ID":"f47be358-b4c8-44d2-a9db-d8c7fb428f49","Type":"ContainerDied","Data":"03a6c7a854959a7ab9668d07d1074743f6d37ff4d39ddb30a25c15177838d0f6"} Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.337363 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n4lg6"
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.337381 4691 scope.go:117] "RemoveContainer" containerID="ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5"
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.362785 4691 scope.go:117] "RemoveContainer" containerID="c8f8a3da2846bb32d898afb8e2fb7da42c7fbdf6439e9409da83e1e93355f409"
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.377233 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f47be358-b4c8-44d2-a9db-d8c7fb428f49-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.377460 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spdgl\" (UniqueName: \"kubernetes.io/projected/f47be358-b4c8-44d2-a9db-d8c7fb428f49-kube-api-access-spdgl\") on node \"crc\" DevicePath \"\""
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.377556 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f47be358-b4c8-44d2-a9db-d8c7fb428f49-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.383168 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n4lg6"]
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.393870 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n4lg6"]
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.409726 4691 scope.go:117] "RemoveContainer" containerID="22b631cb8047c42cc4029d15670d6612467f220cff944db53761b1d21f63ba2c"
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.443864 4691 scope.go:117] "RemoveContainer" containerID="ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5"
Dec 02 08:23:12 crc kubenswrapper[4691]: E1202 08:23:12.444290 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5\": container with ID starting with ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5 not found: ID does not exist" containerID="ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5"
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.444344 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5"} err="failed to get container status \"ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5\": rpc error: code = NotFound desc = could not find container \"ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5\": container with ID starting with ba7b1006bf775795890854615a91bc0331703f53f7c21d1ce648c461d1bacef5 not found: ID does not exist"
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.444373 4691 scope.go:117] "RemoveContainer" containerID="c8f8a3da2846bb32d898afb8e2fb7da42c7fbdf6439e9409da83e1e93355f409"
Dec 02 08:23:12 crc kubenswrapper[4691]: E1202 08:23:12.444962 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8f8a3da2846bb32d898afb8e2fb7da42c7fbdf6439e9409da83e1e93355f409\": container with ID starting with c8f8a3da2846bb32d898afb8e2fb7da42c7fbdf6439e9409da83e1e93355f409 not found: ID does not exist" containerID="c8f8a3da2846bb32d898afb8e2fb7da42c7fbdf6439e9409da83e1e93355f409"
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.445026 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8f8a3da2846bb32d898afb8e2fb7da42c7fbdf6439e9409da83e1e93355f409"} err="failed to get container status \"c8f8a3da2846bb32d898afb8e2fb7da42c7fbdf6439e9409da83e1e93355f409\": rpc error: code = NotFound desc = could not find container \"c8f8a3da2846bb32d898afb8e2fb7da42c7fbdf6439e9409da83e1e93355f409\": container with ID starting with c8f8a3da2846bb32d898afb8e2fb7da42c7fbdf6439e9409da83e1e93355f409 not found: ID does not exist"
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.445067 4691 scope.go:117] "RemoveContainer" containerID="22b631cb8047c42cc4029d15670d6612467f220cff944db53761b1d21f63ba2c"
Dec 02 08:23:12 crc kubenswrapper[4691]: E1202 08:23:12.445418 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22b631cb8047c42cc4029d15670d6612467f220cff944db53761b1d21f63ba2c\": container with ID starting with 22b631cb8047c42cc4029d15670d6612467f220cff944db53761b1d21f63ba2c not found: ID does not exist" containerID="22b631cb8047c42cc4029d15670d6612467f220cff944db53761b1d21f63ba2c"
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.445449 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b631cb8047c42cc4029d15670d6612467f220cff944db53761b1d21f63ba2c"} err="failed to get container status \"22b631cb8047c42cc4029d15670d6612467f220cff944db53761b1d21f63ba2c\": rpc error: code = NotFound desc = could not find container \"22b631cb8047c42cc4029d15670d6612467f220cff944db53761b1d21f63ba2c\": container with ID starting with 22b631cb8047c42cc4029d15670d6612467f220cff944db53761b1d21f63ba2c not found: ID does not exist"
Dec 02 08:23:12 crc kubenswrapper[4691]: I1202 08:23:12.573617 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f47be358-b4c8-44d2-a9db-d8c7fb428f49" path="/var/lib/kubelet/pods/f47be358-b4c8-44d2-a9db-d8c7fb428f49/volumes"
Dec 02 08:24:51 crc kubenswrapper[4691]: I1202 08:24:51.899162 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 08:24:51 crc kubenswrapper[4691]: I1202 08:24:51.899622 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 08:25:21 crc kubenswrapper[4691]: I1202 08:25:21.899157 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 08:25:21 crc kubenswrapper[4691]: I1202 08:25:21.899773 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 08:25:24 crc kubenswrapper[4691]: I1202 08:25:24.887596 4691 generic.go:334] "Generic (PLEG): container finished" podID="d1c6d92a-1daf-4554-822b-1c946124e1d0" containerID="2aaaa423d4bd9058eecc47a0f70888c0027048d64184d69727c30775025bb6d4" exitCode=0
Dec 02 08:25:24 crc kubenswrapper[4691]: I1202 08:25:24.887685 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" event={"ID":"d1c6d92a-1daf-4554-822b-1c946124e1d0","Type":"ContainerDied","Data":"2aaaa423d4bd9058eecc47a0f70888c0027048d64184d69727c30775025bb6d4"}
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.444278 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb"
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.597417 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xdvk\" (UniqueName: \"kubernetes.io/projected/d1c6d92a-1daf-4554-822b-1c946124e1d0-kube-api-access-5xdvk\") pod \"d1c6d92a-1daf-4554-822b-1c946124e1d0\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") "
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.597466 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-ssh-key\") pod \"d1c6d92a-1daf-4554-822b-1c946124e1d0\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") "
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.597569 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-libvirt-combined-ca-bundle\") pod \"d1c6d92a-1daf-4554-822b-1c946124e1d0\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") "
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.597668 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-inventory\") pod \"d1c6d92a-1daf-4554-822b-1c946124e1d0\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") "
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.597861 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-libvirt-secret-0\") pod \"d1c6d92a-1daf-4554-822b-1c946124e1d0\" (UID: \"d1c6d92a-1daf-4554-822b-1c946124e1d0\") "
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.604183 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d1c6d92a-1daf-4554-822b-1c946124e1d0" (UID: "d1c6d92a-1daf-4554-822b-1c946124e1d0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.613117 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1c6d92a-1daf-4554-822b-1c946124e1d0-kube-api-access-5xdvk" (OuterVolumeSpecName: "kube-api-access-5xdvk") pod "d1c6d92a-1daf-4554-822b-1c946124e1d0" (UID: "d1c6d92a-1daf-4554-822b-1c946124e1d0"). InnerVolumeSpecName "kube-api-access-5xdvk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.629038 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "d1c6d92a-1daf-4554-822b-1c946124e1d0" (UID: "d1c6d92a-1daf-4554-822b-1c946124e1d0"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.634595 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-inventory" (OuterVolumeSpecName: "inventory") pod "d1c6d92a-1daf-4554-822b-1c946124e1d0" (UID: "d1c6d92a-1daf-4554-822b-1c946124e1d0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.649413 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d1c6d92a-1daf-4554-822b-1c946124e1d0" (UID: "d1c6d92a-1daf-4554-822b-1c946124e1d0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.700885 4691 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.700916 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xdvk\" (UniqueName: \"kubernetes.io/projected/d1c6d92a-1daf-4554-822b-1c946124e1d0-kube-api-access-5xdvk\") on node \"crc\" DevicePath \"\""
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.700927 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.700938 4691 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.700947 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d1c6d92a-1daf-4554-822b-1c946124e1d0-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.906544 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb" event={"ID":"d1c6d92a-1daf-4554-822b-1c946124e1d0","Type":"ContainerDied","Data":"a3379298b886bde9919e48dcf74ab2fc825577327036581fccf7f1858f884829"}
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.906574 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb"
Dec 02 08:25:26 crc kubenswrapper[4691]: I1202 08:25:26.906603 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3379298b886bde9919e48dcf74ab2fc825577327036581fccf7f1858f884829"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.020919 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"]
Dec 02 08:25:27 crc kubenswrapper[4691]: E1202 08:25:27.021380 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b9f587-2b06-4517-9181-29cc93ddcac6" containerName="extract-utilities"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021395 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b9f587-2b06-4517-9181-29cc93ddcac6" containerName="extract-utilities"
Dec 02 08:25:27 crc kubenswrapper[4691]: E1202 08:25:27.021418 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f47be358-b4c8-44d2-a9db-d8c7fb428f49" containerName="extract-content"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021424 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f47be358-b4c8-44d2-a9db-d8c7fb428f49" containerName="extract-content"
Dec 02 08:25:27 crc kubenswrapper[4691]: E1202 08:25:27.021432 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" containerName="extract-utilities"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021438 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" containerName="extract-utilities"
Dec 02 08:25:27 crc kubenswrapper[4691]: E1202 08:25:27.021453 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1c6d92a-1daf-4554-822b-1c946124e1d0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021460 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1c6d92a-1daf-4554-822b-1c946124e1d0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Dec 02 08:25:27 crc kubenswrapper[4691]: E1202 08:25:27.021480 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f47be358-b4c8-44d2-a9db-d8c7fb428f49" containerName="registry-server"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021486 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f47be358-b4c8-44d2-a9db-d8c7fb428f49" containerName="registry-server"
Dec 02 08:25:27 crc kubenswrapper[4691]: E1202 08:25:27.021500 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b9f587-2b06-4517-9181-29cc93ddcac6" containerName="extract-content"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021507 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b9f587-2b06-4517-9181-29cc93ddcac6" containerName="extract-content"
Dec 02 08:25:27 crc kubenswrapper[4691]: E1202 08:25:27.021521 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" containerName="extract-content"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021527 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" containerName="extract-content"
Dec 02 08:25:27 crc kubenswrapper[4691]: E1202 08:25:27.021545 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b9f587-2b06-4517-9181-29cc93ddcac6" containerName="registry-server"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021550 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b9f587-2b06-4517-9181-29cc93ddcac6" containerName="registry-server"
Dec 02 08:25:27 crc kubenswrapper[4691]: E1202 08:25:27.021564 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" containerName="registry-server"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021570 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" containerName="registry-server"
Dec 02 08:25:27 crc kubenswrapper[4691]: E1202 08:25:27.021577 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f47be358-b4c8-44d2-a9db-d8c7fb428f49" containerName="extract-utilities"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021583 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f47be358-b4c8-44d2-a9db-d8c7fb428f49" containerName="extract-utilities"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021770 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1c6d92a-1daf-4554-822b-1c946124e1d0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021788 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f47be358-b4c8-44d2-a9db-d8c7fb428f49" containerName="registry-server"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021800 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b9f587-2b06-4517-9181-29cc93ddcac6" containerName="registry-server"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.021815 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c31ac24-b764-4e8b-bfd8-2b25ab60dee8" containerName="registry-server"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.022524 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.026595 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.026616 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.026725 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.026855 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.027175 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.027357 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.027702 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.042212 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"]
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.108634 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.108930 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.109106 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.109254 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.109372 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rd97\" (UniqueName: \"kubernetes.io/projected/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-kube-api-access-5rd97\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.109479 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.109578 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.109698 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.109862 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.211702 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.211830 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.211869 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.211890 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.212007 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.212056 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.212112 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rd97\" (UniqueName: \"kubernetes.io/projected/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-kube-api-access-5rd97\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.212166 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.212194 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.213398 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.215814 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.216299 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.216527 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.217886 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.220612 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.220777 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.221097 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.237172 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rd97\" (UniqueName: \"kubernetes.io/projected/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-kube-api-access-5rd97\") pod \"nova-edpm-deployment-openstack-edpm-ipam-vf7w6\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.378244 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.878432 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"]
Dec 02 08:25:27 crc kubenswrapper[4691]: W1202 08:25:27.883048 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaf13f0c_0ba4_4e4f_95cb_2de2f510801e.slice/crio-31477c4cbf143cf4312024757794bdadd321a7931badcf225ead6004c936d8b9 WatchSource:0}: Error finding container 31477c4cbf143cf4312024757794bdadd321a7931badcf225ead6004c936d8b9: Status 404 returned error can't find the container with id 31477c4cbf143cf4312024757794bdadd321a7931badcf225ead6004c936d8b9
Dec 02 08:25:27 crc kubenswrapper[4691]: I1202 08:25:27.936909 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6" event={"ID":"baf13f0c-0ba4-4e4f-95cb-2de2f510801e","Type":"ContainerStarted","Data":"31477c4cbf143cf4312024757794bdadd321a7931badcf225ead6004c936d8b9"}
Dec 02 08:25:28 crc kubenswrapper[4691]: I1202 08:25:28.949936 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6" event={"ID":"baf13f0c-0ba4-4e4f-95cb-2de2f510801e","Type":"ContainerStarted","Data":"c94e7f32c8cbde5b56b96ea1cea18fffb5dc3ed044a9e4628d59541beb4073cf"}
Dec 02 08:25:28 crc kubenswrapper[4691]: I1202 08:25:28.988711 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6" podStartSLOduration=2.468319276 podStartE2EDuration="2.988686106s" podCreationTimestamp="2025-12-02 08:25:26 +0000 UTC" firstStartedPulling="2025-12-02 08:25:27.88516325 +0000 UTC m=+2375.669242102" lastFinishedPulling="2025-12-02 08:25:28.40553007 +0000 UTC m=+2376.189608932" observedRunningTime="2025-12-02 08:25:28.97881937 +0000 UTC m=+2376.762898222" watchObservedRunningTime="2025-12-02 08:25:28.988686106 +0000 UTC m=+2376.772764978"
Dec 02 08:25:51 crc kubenswrapper[4691]: I1202 08:25:51.898513 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 08:25:51 crc kubenswrapper[4691]: I1202 08:25:51.899487 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 08:25:51 crc kubenswrapper[4691]: I1202 08:25:51.899545 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6"
Dec 02 08:25:51 crc kubenswrapper[4691]: I1202 08:25:51.900269 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 02 08:25:51 crc kubenswrapper[4691]: I1202 08:25:51.900342 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d" gracePeriod=600
Dec 02 08:25:52 crc kubenswrapper[4691]: E1202 08:25:52.041488 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:25:52 crc kubenswrapper[4691]: I1202 08:25:52.156088 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d" exitCode=0
Dec 02 08:25:52 crc kubenswrapper[4691]: I1202 08:25:52.156131 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"}
Dec 02 08:25:52 crc kubenswrapper[4691]: I1202 08:25:52.156171 4691 scope.go:117] "RemoveContainer" containerID="76431cd8b2e3d8ce369a661c072e2a35d2afb457a7614f3fda9fe111278924cc"
Dec 02 08:25:52 crc kubenswrapper[4691]: I1202 08:25:52.157134 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:25:52 crc kubenswrapper[4691]: E1202 08:25:52.157494 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:26:03 crc kubenswrapper[4691]: I1202 08:26:03.561116 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:26:03 crc kubenswrapper[4691]: E1202 08:26:03.561971 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:26:14 crc kubenswrapper[4691]: I1202 08:26:14.561642 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:26:14 crc kubenswrapper[4691]: E1202 08:26:14.562555 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:26:29 crc kubenswrapper[4691]: I1202 08:26:29.561820 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:26:29 crc kubenswrapper[4691]: E1202 08:26:29.562683 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:26:42 crc kubenswrapper[4691]: I1202 08:26:42.575396 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:26:42 crc kubenswrapper[4691]: E1202 08:26:42.576564 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:26:55 crc kubenswrapper[4691]: I1202 08:26:55.561312 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:26:55 crc kubenswrapper[4691]: E1202 08:26:55.562302 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:27:08 crc kubenswrapper[4691]: I1202 08:27:08.562858 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:27:08 crc kubenswrapper[4691]: E1202 08:27:08.564187 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:27:23 crc kubenswrapper[4691]: I1202 08:27:23.562902 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:27:23 crc kubenswrapper[4691]: E1202 08:27:23.564215 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:27:37 crc kubenswrapper[4691]: I1202 08:27:37.561773 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:27:37 crc kubenswrapper[4691]: E1202 08:27:37.562679 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:27:52 crc kubenswrapper[4691]: I1202 08:27:52.567817 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:27:52 crc kubenswrapper[4691]: E1202 08:27:52.568367 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:28:07 crc kubenswrapper[4691]: I1202 08:28:07.562222 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:28:07 crc kubenswrapper[4691]: E1202 08:28:07.562943 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:28:08 crc kubenswrapper[4691]: I1202 08:28:08.518739 4691 generic.go:334] "Generic (PLEG): container finished" podID="baf13f0c-0ba4-4e4f-95cb-2de2f510801e" containerID="c94e7f32c8cbde5b56b96ea1cea18fffb5dc3ed044a9e4628d59541beb4073cf" exitCode=0
Dec 02 08:28:08 crc kubenswrapper[4691]: I1202 08:28:08.518806 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6" event={"ID":"baf13f0c-0ba4-4e4f-95cb-2de2f510801e","Type":"ContainerDied","Data":"c94e7f32c8cbde5b56b96ea1cea18fffb5dc3ed044a9e4628d59541beb4073cf"}
Dec 02 08:28:09 crc kubenswrapper[4691]: I1202 08:28:09.947011 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.046413 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-extra-config-0\") pod \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") "
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.046796 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-ssh-key\") pod \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") "
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.046989 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-combined-ca-bundle\") pod \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") "
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.047107 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-migration-ssh-key-1\") pod \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") "
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.047253 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-cell1-compute-config-1\") pod \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") "
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.047367 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-migration-ssh-key-0\") pod \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") "
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.047488 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-cell1-compute-config-0\") pod \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") "
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.047696 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-inventory\") pod \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") "
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.047861 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rd97\" (UniqueName: \"kubernetes.io/projected/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-kube-api-access-5rd97\") pod \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\" (UID: \"baf13f0c-0ba4-4e4f-95cb-2de2f510801e\") "
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.052858 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-kube-api-access-5rd97" (OuterVolumeSpecName: "kube-api-access-5rd97") pod "baf13f0c-0ba4-4e4f-95cb-2de2f510801e" (UID: "baf13f0c-0ba4-4e4f-95cb-2de2f510801e"). InnerVolumeSpecName "kube-api-access-5rd97". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.066109 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "baf13f0c-0ba4-4e4f-95cb-2de2f510801e" (UID: "baf13f0c-0ba4-4e4f-95cb-2de2f510801e"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.084085 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "baf13f0c-0ba4-4e4f-95cb-2de2f510801e" (UID: "baf13f0c-0ba4-4e4f-95cb-2de2f510801e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.084605 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "baf13f0c-0ba4-4e4f-95cb-2de2f510801e" (UID: "baf13f0c-0ba4-4e4f-95cb-2de2f510801e"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.086889 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-inventory" (OuterVolumeSpecName: "inventory") pod "baf13f0c-0ba4-4e4f-95cb-2de2f510801e" (UID: "baf13f0c-0ba4-4e4f-95cb-2de2f510801e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.089167 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "baf13f0c-0ba4-4e4f-95cb-2de2f510801e" (UID: "baf13f0c-0ba4-4e4f-95cb-2de2f510801e"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.091746 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "baf13f0c-0ba4-4e4f-95cb-2de2f510801e" (UID: "baf13f0c-0ba4-4e4f-95cb-2de2f510801e"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.097507 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "baf13f0c-0ba4-4e4f-95cb-2de2f510801e" (UID: "baf13f0c-0ba4-4e4f-95cb-2de2f510801e"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.108473 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "baf13f0c-0ba4-4e4f-95cb-2de2f510801e" (UID: "baf13f0c-0ba4-4e4f-95cb-2de2f510801e"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.149168 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-inventory\") on node \"crc\" DevicePath \"\""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.149212 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rd97\" (UniqueName: \"kubernetes.io/projected/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-kube-api-access-5rd97\") on node \"crc\" DevicePath \"\""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.149226 4691 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.149235 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.149245 4691 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.149254 4691 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.149265 4691 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.149275 4691 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.149287 4691 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/baf13f0c-0ba4-4e4f-95cb-2de2f510801e-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.544140 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6" event={"ID":"baf13f0c-0ba4-4e4f-95cb-2de2f510801e","Type":"ContainerDied","Data":"31477c4cbf143cf4312024757794bdadd321a7931badcf225ead6004c936d8b9"}
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.544193 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31477c4cbf143cf4312024757794bdadd321a7931badcf225ead6004c936d8b9"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.544258 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-vf7w6"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.650117 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"]
Dec 02 08:28:10 crc kubenswrapper[4691]: E1202 08:28:10.650575 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf13f0c-0ba4-4e4f-95cb-2de2f510801e" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.650592 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf13f0c-0ba4-4e4f-95cb-2de2f510801e" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.650811 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf13f0c-0ba4-4e4f-95cb-2de2f510801e" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.651507 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.653696 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.654037 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-f5xv6"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.654580 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.654747 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.654925 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.658189 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.658291 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.658363 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.658391 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9tj5\" (UniqueName: \"kubernetes.io/projected/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-kube-api-access-n9tj5\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.658434 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.658467 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.658529 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.660237 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"]
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.759337 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.759403 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9tj5\" (UniqueName: \"kubernetes.io/projected/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-kube-api-access-n9tj5\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.759443 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.759473 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.759530 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.759576 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.759639 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.763816 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.764235 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.765244 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.765459 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.767628 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.777141 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.780985 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9tj5\" (UniqueName: \"kubernetes.io/projected/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-kube-api-access-n9tj5\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:10 crc kubenswrapper[4691]: I1202 08:28:10.970140 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"
Dec 02 08:28:11 crc kubenswrapper[4691]: I1202 08:28:11.539673 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg"]
Dec 02 08:28:11 crc kubenswrapper[4691]: W1202 08:28:11.541307 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5fcdaa5_c1a6_4f23_b953_0d31524ee62f.slice/crio-58bd362c251b5b0c60200b6a9eff5e708f5958314a83eb998860fa4564f9e38d WatchSource:0}: Error finding container 58bd362c251b5b0c60200b6a9eff5e708f5958314a83eb998860fa4564f9e38d: Status 404 returned error can't find the container with id 58bd362c251b5b0c60200b6a9eff5e708f5958314a83eb998860fa4564f9e38d
Dec 02 08:28:11 crc kubenswrapper[4691]: I1202 08:28:11.544037 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 08:28:11 crc kubenswrapper[4691]: I1202 08:28:11.556188 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg" event={"ID":"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f","Type":"ContainerStarted","Data":"58bd362c251b5b0c60200b6a9eff5e708f5958314a83eb998860fa4564f9e38d"}
Dec 02 08:28:12 crc kubenswrapper[4691]: I1202 08:28:12.577766 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg" event={"ID":"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f","Type":"ContainerStarted","Data":"4f8401e96a58b0aaf5f34b05bbff9a4747d1cf52688c2be297e10b503b2c8aec"}
Dec 02 08:28:12 crc kubenswrapper[4691]: I1202 08:28:12.611519 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg" podStartSLOduration=1.968331549 podStartE2EDuration="2.611499622s" podCreationTimestamp="2025-12-02 08:28:10 +0000 UTC" firstStartedPulling="2025-12-02 08:28:11.543820073 +0000 UTC m=+2539.327898935" lastFinishedPulling="2025-12-02 08:28:12.186988146 +0000 UTC m=+2539.971067008" observedRunningTime="2025-12-02 08:28:12.6102548 +0000 UTC m=+2540.394333682" watchObservedRunningTime="2025-12-02 08:28:12.611499622 +0000 UTC m=+2540.395578484"
Dec 02 08:28:18 crc kubenswrapper[4691]: I1202 08:28:18.562378 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:28:18 crc kubenswrapper[4691]: E1202 08:28:18.563139 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:28:31 crc kubenswrapper[4691]: I1202 08:28:31.561970 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:28:31 crc kubenswrapper[4691]: E1202 08:28:31.562890 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:28:43 crc kubenswrapper[4691]: I1202 08:28:43.561505 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:28:43 crc kubenswrapper[4691]: E1202 08:28:43.562204 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:28:56 crc kubenswrapper[4691]: I1202 08:28:56.562178 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:28:56 crc kubenswrapper[4691]: E1202 08:28:56.563049 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:29:11 crc kubenswrapper[4691]: I1202 08:29:11.561662 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d"
Dec 02 08:29:11 crc kubenswrapper[4691]: E1202 08:29:11.562966 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6"
podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:29:26 crc kubenswrapper[4691]: I1202 08:29:26.562435 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d" Dec 02 08:29:26 crc kubenswrapper[4691]: E1202 08:29:26.563217 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:29:39 crc kubenswrapper[4691]: I1202 08:29:39.561726 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d" Dec 02 08:29:39 crc kubenswrapper[4691]: E1202 08:29:39.563062 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:29:51 crc kubenswrapper[4691]: I1202 08:29:51.562699 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d" Dec 02 08:29:51 crc kubenswrapper[4691]: E1202 08:29:51.563461 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.155778 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf"] Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.159884 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.162951 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.163194 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.169108 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf"] Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.287718 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60a2068b-473b-4a2f-a1c1-064528e7676f-secret-volume\") pod \"collect-profiles-29411070-xk5zf\" (UID: \"60a2068b-473b-4a2f-a1c1-064528e7676f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.287789 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60a2068b-473b-4a2f-a1c1-064528e7676f-config-volume\") pod \"collect-profiles-29411070-xk5zf\" (UID: \"60a2068b-473b-4a2f-a1c1-064528e7676f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.287868 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9hc2\" (UniqueName: \"kubernetes.io/projected/60a2068b-473b-4a2f-a1c1-064528e7676f-kube-api-access-t9hc2\") pod \"collect-profiles-29411070-xk5zf\" (UID: \"60a2068b-473b-4a2f-a1c1-064528e7676f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.389715 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60a2068b-473b-4a2f-a1c1-064528e7676f-secret-volume\") pod \"collect-profiles-29411070-xk5zf\" (UID: \"60a2068b-473b-4a2f-a1c1-064528e7676f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.389791 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60a2068b-473b-4a2f-a1c1-064528e7676f-config-volume\") pod \"collect-profiles-29411070-xk5zf\" (UID: \"60a2068b-473b-4a2f-a1c1-064528e7676f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.389834 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9hc2\" (UniqueName: \"kubernetes.io/projected/60a2068b-473b-4a2f-a1c1-064528e7676f-kube-api-access-t9hc2\") pod \"collect-profiles-29411070-xk5zf\" (UID: \"60a2068b-473b-4a2f-a1c1-064528e7676f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.391141 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60a2068b-473b-4a2f-a1c1-064528e7676f-config-volume\") pod 
\"collect-profiles-29411070-xk5zf\" (UID: \"60a2068b-473b-4a2f-a1c1-064528e7676f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.395854 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60a2068b-473b-4a2f-a1c1-064528e7676f-secret-volume\") pod \"collect-profiles-29411070-xk5zf\" (UID: \"60a2068b-473b-4a2f-a1c1-064528e7676f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.412496 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9hc2\" (UniqueName: \"kubernetes.io/projected/60a2068b-473b-4a2f-a1c1-064528e7676f-kube-api-access-t9hc2\") pod \"collect-profiles-29411070-xk5zf\" (UID: \"60a2068b-473b-4a2f-a1c1-064528e7676f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.500783 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:00 crc kubenswrapper[4691]: I1202 08:30:00.950183 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf"] Dec 02 08:30:01 crc kubenswrapper[4691]: I1202 08:30:01.036638 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" event={"ID":"60a2068b-473b-4a2f-a1c1-064528e7676f","Type":"ContainerStarted","Data":"3e9fe3da43d904bccc7446c8b1e490e684197bfe7007d861f223a6d4081010a2"} Dec 02 08:30:02 crc kubenswrapper[4691]: I1202 08:30:02.048943 4691 generic.go:334] "Generic (PLEG): container finished" podID="60a2068b-473b-4a2f-a1c1-064528e7676f" containerID="6627fdd84a5db6a57430c6597a9db37e158b267dcbf92651ada84e7ba1c670fb" exitCode=0 Dec 02 08:30:02 crc kubenswrapper[4691]: I1202 08:30:02.049020 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" event={"ID":"60a2068b-473b-4a2f-a1c1-064528e7676f","Type":"ContainerDied","Data":"6627fdd84a5db6a57430c6597a9db37e158b267dcbf92651ada84e7ba1c670fb"} Dec 02 08:30:03 crc kubenswrapper[4691]: I1202 08:30:03.391137 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:03 crc kubenswrapper[4691]: I1202 08:30:03.585677 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60a2068b-473b-4a2f-a1c1-064528e7676f-secret-volume\") pod \"60a2068b-473b-4a2f-a1c1-064528e7676f\" (UID: \"60a2068b-473b-4a2f-a1c1-064528e7676f\") " Dec 02 08:30:03 crc kubenswrapper[4691]: I1202 08:30:03.586032 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9hc2\" (UniqueName: \"kubernetes.io/projected/60a2068b-473b-4a2f-a1c1-064528e7676f-kube-api-access-t9hc2\") pod \"60a2068b-473b-4a2f-a1c1-064528e7676f\" (UID: \"60a2068b-473b-4a2f-a1c1-064528e7676f\") " Dec 02 08:30:03 crc kubenswrapper[4691]: I1202 08:30:03.586532 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60a2068b-473b-4a2f-a1c1-064528e7676f-config-volume\") pod \"60a2068b-473b-4a2f-a1c1-064528e7676f\" (UID: \"60a2068b-473b-4a2f-a1c1-064528e7676f\") " Dec 02 08:30:03 crc kubenswrapper[4691]: I1202 08:30:03.587087 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60a2068b-473b-4a2f-a1c1-064528e7676f-config-volume" (OuterVolumeSpecName: "config-volume") pod "60a2068b-473b-4a2f-a1c1-064528e7676f" (UID: "60a2068b-473b-4a2f-a1c1-064528e7676f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:30:03 crc kubenswrapper[4691]: I1202 08:30:03.588235 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/60a2068b-473b-4a2f-a1c1-064528e7676f-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:03 crc kubenswrapper[4691]: I1202 08:30:03.592894 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60a2068b-473b-4a2f-a1c1-064528e7676f-kube-api-access-t9hc2" (OuterVolumeSpecName: "kube-api-access-t9hc2") pod "60a2068b-473b-4a2f-a1c1-064528e7676f" (UID: "60a2068b-473b-4a2f-a1c1-064528e7676f"). InnerVolumeSpecName "kube-api-access-t9hc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:30:03 crc kubenswrapper[4691]: I1202 08:30:03.594398 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60a2068b-473b-4a2f-a1c1-064528e7676f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "60a2068b-473b-4a2f-a1c1-064528e7676f" (UID: "60a2068b-473b-4a2f-a1c1-064528e7676f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:30:03 crc kubenswrapper[4691]: I1202 08:30:03.689374 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/60a2068b-473b-4a2f-a1c1-064528e7676f-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:03 crc kubenswrapper[4691]: I1202 08:30:03.689426 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9hc2\" (UniqueName: \"kubernetes.io/projected/60a2068b-473b-4a2f-a1c1-064528e7676f-kube-api-access-t9hc2\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:04 crc kubenswrapper[4691]: I1202 08:30:04.071543 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" event={"ID":"60a2068b-473b-4a2f-a1c1-064528e7676f","Type":"ContainerDied","Data":"3e9fe3da43d904bccc7446c8b1e490e684197bfe7007d861f223a6d4081010a2"} Dec 02 08:30:04 crc kubenswrapper[4691]: I1202 08:30:04.071600 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e9fe3da43d904bccc7446c8b1e490e684197bfe7007d861f223a6d4081010a2" Dec 02 08:30:04 crc kubenswrapper[4691]: I1202 08:30:04.071600 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411070-xk5zf" Dec 02 08:30:04 crc kubenswrapper[4691]: I1202 08:30:04.470725 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j"] Dec 02 08:30:04 crc kubenswrapper[4691]: I1202 08:30:04.479078 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411025-88q9j"] Dec 02 08:30:04 crc kubenswrapper[4691]: I1202 08:30:04.578058 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="943a92a5-ba00-456a-83f4-c383e252288a" path="/var/lib/kubelet/pods/943a92a5-ba00-456a-83f4-c383e252288a/volumes" Dec 02 08:30:06 crc kubenswrapper[4691]: I1202 08:30:06.562108 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d" Dec 02 08:30:06 crc kubenswrapper[4691]: E1202 08:30:06.562697 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:30:20 crc kubenswrapper[4691]: I1202 08:30:20.562447 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d" Dec 02 08:30:20 crc kubenswrapper[4691]: E1202 08:30:20.563271 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:30:31 crc kubenswrapper[4691]: I1202 08:30:31.319618 4691 scope.go:117] "RemoveContainer" containerID="15ce1a00fdad12a14dc02d5b96b66ab5ee6272b87015b8a50b9639ac19175cfd" Dec 02 08:30:32 
crc kubenswrapper[4691]: I1202 08:30:32.431644 4691 generic.go:334] "Generic (PLEG): container finished" podID="a5fcdaa5-c1a6-4f23-b953-0d31524ee62f" containerID="4f8401e96a58b0aaf5f34b05bbff9a4747d1cf52688c2be297e10b503b2c8aec" exitCode=0 Dec 02 08:30:32 crc kubenswrapper[4691]: I1202 08:30:32.431731 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg" event={"ID":"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f","Type":"ContainerDied","Data":"4f8401e96a58b0aaf5f34b05bbff9a4747d1cf52688c2be297e10b503b2c8aec"} Dec 02 08:30:33 crc kubenswrapper[4691]: I1202 08:30:33.920150 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg" Dec 02 08:30:33 crc kubenswrapper[4691]: I1202 08:30:33.953791 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-2\") pod \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " Dec 02 08:30:33 crc kubenswrapper[4691]: I1202 08:30:33.953944 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-0\") pod \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " Dec 02 08:30:33 crc kubenswrapper[4691]: I1202 08:30:33.954161 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-inventory\") pod \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " Dec 02 08:30:33 crc kubenswrapper[4691]: I1202 08:30:33.954196 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9tj5\" (UniqueName: \"kubernetes.io/projected/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-kube-api-access-n9tj5\") pod \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " Dec 02 08:30:33 crc kubenswrapper[4691]: I1202 08:30:33.954266 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-telemetry-combined-ca-bundle\") pod \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " Dec 02 08:30:33 crc kubenswrapper[4691]: I1202 08:30:33.954306 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ssh-key\") pod \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " Dec 02 08:30:33 crc kubenswrapper[4691]: I1202 08:30:33.954354 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-1\") pod \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\" (UID: \"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f\") " Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.004159 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-kube-api-access-n9tj5" (OuterVolumeSpecName: "kube-api-access-n9tj5") pod "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f" (UID: "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f"). InnerVolumeSpecName "kube-api-access-n9tj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.004512 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f" (UID: "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.009301 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f" (UID: "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.010434 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f" (UID: "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.011554 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-inventory" (OuterVolumeSpecName: "inventory") pod "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f" (UID: "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.011637 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f" (UID: "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.011650 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f" (UID: "a5fcdaa5-c1a6-4f23-b953-0d31524ee62f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.056292 4691 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.056325 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9tj5\" (UniqueName: \"kubernetes.io/projected/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-kube-api-access-n9tj5\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.056363 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-inventory\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.056375 4691 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.056384 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.056393 4691 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.056402 4691 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a5fcdaa5-c1a6-4f23-b953-0d31524ee62f-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.453036 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg" event={"ID":"a5fcdaa5-c1a6-4f23-b953-0d31524ee62f","Type":"ContainerDied","Data":"58bd362c251b5b0c60200b6a9eff5e708f5958314a83eb998860fa4564f9e38d"} Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.453078 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58bd362c251b5b0c60200b6a9eff5e708f5958314a83eb998860fa4564f9e38d" Dec 02 08:30:34 crc kubenswrapper[4691]: I1202 08:30:34.453083 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg" Dec 02 08:30:35 crc kubenswrapper[4691]: I1202 08:30:35.562679 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d" Dec 02 08:30:35 crc kubenswrapper[4691]: E1202 08:30:35.563585 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:30:48 crc kubenswrapper[4691]: I1202 08:30:48.561858 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d" Dec 02 08:30:48 crc kubenswrapper[4691]: E1202 08:30:48.562805 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:30:59 crc kubenswrapper[4691]: I1202 08:30:59.562334 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d" Dec 02 08:31:00 crc kubenswrapper[4691]: I1202 08:31:00.758781 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"387a9f01752980f7a895280ce4d44745ff4d7bd96a061e741af6408308862c87"} Dec 02 08:31:18 crc kubenswrapper[4691]: E1202 08:31:18.244569 4691 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.222:44186->38.102.83.222:41995: write tcp 38.102.83.222:44186->38.102.83.222:41995: write: broken pipe Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.289950 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 08:31:30 crc kubenswrapper[4691]: E1202 08:31:30.290806 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60a2068b-473b-4a2f-a1c1-064528e7676f" containerName="collect-profiles" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.290820 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="60a2068b-473b-4a2f-a1c1-064528e7676f" containerName="collect-profiles" Dec 02 08:31:30 crc kubenswrapper[4691]: E1202 08:31:30.290850 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fcdaa5-c1a6-4f23-b953-0d31524ee62f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.290857 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5fcdaa5-c1a6-4f23-b953-0d31524ee62f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.291058 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="60a2068b-473b-4a2f-a1c1-064528e7676f" containerName="collect-profiles" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.291078 4691 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a5fcdaa5-c1a6-4f23-b953-0d31524ee62f" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.291868 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.293952 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.294183 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.294223 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.296034 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mtcjg" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.301863 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.355343 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d635e45-a63d-4661-9b82-b21d8ce59623-config-data\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.355392 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.355494 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d635e45-a63d-4661-9b82-b21d8ce59623-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.457443 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.457496 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d635e45-a63d-4661-9b82-b21d8ce59623-config-data\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.457514 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0d635e45-a63d-4661-9b82-b21d8ce59623-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc 
kubenswrapper[4691]: I1202 08:31:30.457537 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.457720 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0d635e45-a63d-4661-9b82-b21d8ce59623-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.457914 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d635e45-a63d-4661-9b82-b21d8ce59623-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.457958 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.458122 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpvfm\" (UniqueName: \"kubernetes.io/projected/0d635e45-a63d-4661-9b82-b21d8ce59623-kube-api-access-hpvfm\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.458153 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.459191 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d635e45-a63d-4661-9b82-b21d8ce59623-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.459394 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d635e45-a63d-4661-9b82-b21d8ce59623-config-data\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.466633 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.560616 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.560665 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0d635e45-a63d-4661-9b82-b21d8ce59623-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.560708 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0d635e45-a63d-4661-9b82-b21d8ce59623-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.560738 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.560801 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpvfm\" (UniqueName: \"kubernetes.io/projected/0d635e45-a63d-4661-9b82-b21d8ce59623-kube-api-access-hpvfm\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.560822 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.561344 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0d635e45-a63d-4661-9b82-b21d8ce59623-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.563128 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0d635e45-a63d-4661-9b82-b21d8ce59623-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.563643 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.567502 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.567737 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.581968 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpvfm\" (UniqueName: \"kubernetes.io/projected/0d635e45-a63d-4661-9b82-b21d8ce59623-kube-api-access-hpvfm\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.596358 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " pod="openstack/tempest-tests-tempest" Dec 02 08:31:30 crc kubenswrapper[4691]: I1202 08:31:30.652300 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 08:31:31 crc kubenswrapper[4691]: I1202 08:31:31.117580 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Dec 02 08:31:32 crc kubenswrapper[4691]: I1202 08:31:32.066860 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0d635e45-a63d-4661-9b82-b21d8ce59623","Type":"ContainerStarted","Data":"eabd4661d30fb362410fd012ac9af59f51b290d43b073f41ec5130b3f1467e8d"} Dec 02 08:32:07 crc kubenswrapper[4691]: E1202 08:32:07.117492 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Dec 02 08:32:07 crc kubenswrapper[4691]: E1202 08:32:07.118304 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hpvfm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(0d635e45-a63d-4661-9b82-b21d8ce59623): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 02 08:32:07 crc kubenswrapper[4691]: E1202 08:32:07.119812 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="0d635e45-a63d-4661-9b82-b21d8ce59623" Dec 02 08:32:07 crc kubenswrapper[4691]: E1202 08:32:07.208410 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="0d635e45-a63d-4661-9b82-b21d8ce59623" Dec 02 08:32:19 crc kubenswrapper[4691]: I1202 08:32:19.079523 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Dec 02 08:32:20 crc kubenswrapper[4691]: I1202 08:32:20.326101 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0d635e45-a63d-4661-9b82-b21d8ce59623","Type":"ContainerStarted","Data":"ec797487da4d91d41b1724601ab823731bfa893326761b6e060b120a234eb9f1"} Dec 02 08:32:20 crc kubenswrapper[4691]: I1202 08:32:20.346288 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.386217348 podStartE2EDuration="51.346265719s" podCreationTimestamp="2025-12-02 08:31:29 +0000 UTC" firstStartedPulling="2025-12-02 08:31:31.116406607 +0000 UTC m=+2738.900485469" lastFinishedPulling="2025-12-02 08:32:19.076454978 +0000 UTC m=+2786.860533840" observedRunningTime="2025-12-02 08:32:20.34311746 +0000 UTC m=+2788.127196332" watchObservedRunningTime="2025-12-02 08:32:20.346265719 +0000 UTC m=+2788.130344581" Dec 02 08:33:21 crc kubenswrapper[4691]: I1202 08:33:21.898900 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:33:21 crc kubenswrapper[4691]: I1202 08:33:21.899428 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.643657 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7tnqh"] Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.647655 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.658372 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7tnqh"] Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.815657 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-utilities\") pod \"community-operators-7tnqh\" (UID: \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\") " pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.815716 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-catalog-content\") pod \"community-operators-7tnqh\" (UID: \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\") " pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.815785 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpzzx\" (UniqueName: \"kubernetes.io/projected/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-kube-api-access-tpzzx\") pod \"community-operators-7tnqh\" (UID: \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\") " pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.917984 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-utilities\") pod \"community-operators-7tnqh\" (UID: \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\") " pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.918035 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-catalog-content\") pod \"community-operators-7tnqh\" (UID: \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\") " pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.918091 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpzzx\" (UniqueName: \"kubernetes.io/projected/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-kube-api-access-tpzzx\") pod \"community-operators-7tnqh\" (UID: \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\") " pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.918505 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-utilities\") pod \"community-operators-7tnqh\" (UID: \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\") " pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.918590 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-catalog-content\") pod \"community-operators-7tnqh\" (UID: \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\") " pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.939485 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tpzzx\" (UniqueName: \"kubernetes.io/projected/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-kube-api-access-tpzzx\") pod \"community-operators-7tnqh\" (UID: \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\") " pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:36 crc kubenswrapper[4691]: I1202 08:33:36.970350 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:37 crc kubenswrapper[4691]: I1202 08:33:37.753885 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7tnqh"] Dec 02 08:33:38 crc kubenswrapper[4691]: I1202 08:33:38.162304 4691 generic.go:334] "Generic (PLEG): container finished" podID="931b78b5-2c3d-44f8-bb48-dc0259eb63ec" containerID="2329b8f4d77f328c7189754fad037e2d09176a8859fa33c871861b0979b3af17" exitCode=0 Dec 02 08:33:38 crc kubenswrapper[4691]: I1202 08:33:38.162368 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tnqh" event={"ID":"931b78b5-2c3d-44f8-bb48-dc0259eb63ec","Type":"ContainerDied","Data":"2329b8f4d77f328c7189754fad037e2d09176a8859fa33c871861b0979b3af17"} Dec 02 08:33:38 crc kubenswrapper[4691]: I1202 08:33:38.162585 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tnqh" event={"ID":"931b78b5-2c3d-44f8-bb48-dc0259eb63ec","Type":"ContainerStarted","Data":"7186f601f3847d3bba54e413ec825d2bac163a81168bfdc7173a7eb17c2c58f7"} Dec 02 08:33:38 crc kubenswrapper[4691]: I1202 08:33:38.164973 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:33:39 crc kubenswrapper[4691]: I1202 08:33:39.176535 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tnqh" event={"ID":"931b78b5-2c3d-44f8-bb48-dc0259eb63ec","Type":"ContainerStarted","Data":"02104c6626e64e1d86da87d809ea3f8844e5ca6464f383287df8a519ef1d6840"} Dec 02 08:33:40 crc kubenswrapper[4691]: I1202 08:33:40.189303 4691 generic.go:334] "Generic (PLEG): container finished" podID="931b78b5-2c3d-44f8-bb48-dc0259eb63ec" containerID="02104c6626e64e1d86da87d809ea3f8844e5ca6464f383287df8a519ef1d6840" exitCode=0 Dec 02 08:33:40 crc kubenswrapper[4691]: I1202 08:33:40.190577 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tnqh" event={"ID":"931b78b5-2c3d-44f8-bb48-dc0259eb63ec","Type":"ContainerDied","Data":"02104c6626e64e1d86da87d809ea3f8844e5ca6464f383287df8a519ef1d6840"} Dec 02 08:33:43 crc kubenswrapper[4691]: I1202 08:33:43.226576 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tnqh" event={"ID":"931b78b5-2c3d-44f8-bb48-dc0259eb63ec","Type":"ContainerStarted","Data":"0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea"} Dec 02 08:33:43 crc kubenswrapper[4691]: I1202 08:33:43.254345 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7tnqh" podStartSLOduration=3.237187129 podStartE2EDuration="7.254317728s" podCreationTimestamp="2025-12-02 08:33:36 +0000 UTC" firstStartedPulling="2025-12-02 08:33:38.16387552 +0000 UTC m=+2865.947954382" lastFinishedPulling="2025-12-02 08:33:42.181006119 +0000 UTC m=+2869.965084981" observedRunningTime="2025-12-02 08:33:43.250841501 +0000 UTC m=+2871.034920383" watchObservedRunningTime="2025-12-02 
08:33:43.254317728 +0000 UTC m=+2871.038396590" Dec 02 08:33:46 crc kubenswrapper[4691]: I1202 08:33:46.971494 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:46 crc kubenswrapper[4691]: I1202 08:33:46.971881 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:47 crc kubenswrapper[4691]: I1202 08:33:47.049657 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:47 crc kubenswrapper[4691]: I1202 08:33:47.327024 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.288242 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7tnqh"] Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.288955 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7tnqh" podUID="931b78b5-2c3d-44f8-bb48-dc0259eb63ec" containerName="registry-server" containerID="cri-o://0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea" gracePeriod=2 Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.808486 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.899093 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.899158 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.907243 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-njk8r"] Dec 02 08:33:51 crc kubenswrapper[4691]: E1202 08:33:51.907846 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931b78b5-2c3d-44f8-bb48-dc0259eb63ec" containerName="registry-server" Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.907862 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="931b78b5-2c3d-44f8-bb48-dc0259eb63ec" containerName="registry-server" Dec 02 08:33:51 crc kubenswrapper[4691]: E1202 08:33:51.907895 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931b78b5-2c3d-44f8-bb48-dc0259eb63ec" containerName="extract-content" Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.907905 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="931b78b5-2c3d-44f8-bb48-dc0259eb63ec" containerName="extract-content" Dec 02 08:33:51 crc kubenswrapper[4691]: E1202 08:33:51.907918 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931b78b5-2c3d-44f8-bb48-dc0259eb63ec" containerName="extract-utilities" Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.907928 4691 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="931b78b5-2c3d-44f8-bb48-dc0259eb63ec" containerName="extract-utilities" Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.908583 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="931b78b5-2c3d-44f8-bb48-dc0259eb63ec" containerName="registry-server" Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.911369 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.930278 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-njk8r"] Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.963658 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-utilities\") pod \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\" (UID: \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\") " Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.963719 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpzzx\" (UniqueName: \"kubernetes.io/projected/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-kube-api-access-tpzzx\") pod \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\" (UID: \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\") " Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.963970 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-catalog-content\") pod \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\" (UID: \"931b78b5-2c3d-44f8-bb48-dc0259eb63ec\") " Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.964828 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-utilities" (OuterVolumeSpecName: "utilities") pod "931b78b5-2c3d-44f8-bb48-dc0259eb63ec" (UID: "931b78b5-2c3d-44f8-bb48-dc0259eb63ec"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:33:51 crc kubenswrapper[4691]: I1202 08:33:51.969277 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-kube-api-access-tpzzx" (OuterVolumeSpecName: "kube-api-access-tpzzx") pod "931b78b5-2c3d-44f8-bb48-dc0259eb63ec" (UID: "931b78b5-2c3d-44f8-bb48-dc0259eb63ec"). InnerVolumeSpecName "kube-api-access-tpzzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.033236 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "931b78b5-2c3d-44f8-bb48-dc0259eb63ec" (UID: "931b78b5-2c3d-44f8-bb48-dc0259eb63ec"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.066207 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-utilities\") pod \"redhat-operators-njk8r\" (UID: \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\") " pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.066283 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9hz4\" (UniqueName: \"kubernetes.io/projected/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-kube-api-access-f9hz4\") pod \"redhat-operators-njk8r\" (UID: \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\") " pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.066313 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-catalog-content\") pod \"redhat-operators-njk8r\" (UID: \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\") " pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.066407 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpzzx\" (UniqueName: \"kubernetes.io/projected/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-kube-api-access-tpzzx\") on node \"crc\" DevicePath \"\"" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.066424 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.066436 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/931b78b5-2c3d-44f8-bb48-dc0259eb63ec-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.175118 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hz4\" (UniqueName: \"kubernetes.io/projected/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-kube-api-access-f9hz4\") pod \"redhat-operators-njk8r\" (UID: \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\") " pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.175209 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-catalog-content\") pod \"redhat-operators-njk8r\" (UID: \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\") " pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.175429 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-utilities\") pod \"redhat-operators-njk8r\" (UID: \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\") " pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.175724 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-catalog-content\") pod \"redhat-operators-njk8r\" 
(UID: \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\") " pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.175889 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-utilities\") pod \"redhat-operators-njk8r\" (UID: \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\") " pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.205856 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9hz4\" (UniqueName: \"kubernetes.io/projected/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-kube-api-access-f9hz4\") pod \"redhat-operators-njk8r\" (UID: \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\") " pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.236112 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.323789 4691 generic.go:334] "Generic (PLEG): container finished" podID="931b78b5-2c3d-44f8-bb48-dc0259eb63ec" containerID="0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea" exitCode=0 Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.323878 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7tnqh" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.323864 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tnqh" event={"ID":"931b78b5-2c3d-44f8-bb48-dc0259eb63ec","Type":"ContainerDied","Data":"0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea"} Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.323966 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7tnqh" event={"ID":"931b78b5-2c3d-44f8-bb48-dc0259eb63ec","Type":"ContainerDied","Data":"7186f601f3847d3bba54e413ec825d2bac163a81168bfdc7173a7eb17c2c58f7"} Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.324014 4691 scope.go:117] "RemoveContainer" containerID="0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.367051 4691 scope.go:117] "RemoveContainer" containerID="02104c6626e64e1d86da87d809ea3f8844e5ca6464f383287df8a519ef1d6840" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.425278 4691 scope.go:117] "RemoveContainer" containerID="2329b8f4d77f328c7189754fad037e2d09176a8859fa33c871861b0979b3af17" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.436808 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7tnqh"] Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.451440 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7tnqh"] Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.474538 4691 scope.go:117] "RemoveContainer" containerID="0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea" Dec 02 08:33:52 crc kubenswrapper[4691]: E1202 08:33:52.486030 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea\": container with ID starting with 
0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea not found: ID does not exist" containerID="0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.486105 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea"} err="failed to get container status \"0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea\": rpc error: code = NotFound desc = could not find container \"0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea\": container with ID starting with 0a5962d8c8b1a683aa6f8118fb783a98473701ea14dc5cea725ca6116863ffea not found: ID does not exist" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.486141 4691 scope.go:117] "RemoveContainer" containerID="02104c6626e64e1d86da87d809ea3f8844e5ca6464f383287df8a519ef1d6840" Dec 02 08:33:52 crc kubenswrapper[4691]: E1202 08:33:52.486999 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02104c6626e64e1d86da87d809ea3f8844e5ca6464f383287df8a519ef1d6840\": container with ID starting with 02104c6626e64e1d86da87d809ea3f8844e5ca6464f383287df8a519ef1d6840 not found: ID does not exist" containerID="02104c6626e64e1d86da87d809ea3f8844e5ca6464f383287df8a519ef1d6840" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.487044 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02104c6626e64e1d86da87d809ea3f8844e5ca6464f383287df8a519ef1d6840"} err="failed to get container status \"02104c6626e64e1d86da87d809ea3f8844e5ca6464f383287df8a519ef1d6840\": rpc error: code = NotFound desc = could not find container \"02104c6626e64e1d86da87d809ea3f8844e5ca6464f383287df8a519ef1d6840\": container with ID starting with 02104c6626e64e1d86da87d809ea3f8844e5ca6464f383287df8a519ef1d6840 not found: ID does not exist" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.487088 4691 scope.go:117] "RemoveContainer" containerID="2329b8f4d77f328c7189754fad037e2d09176a8859fa33c871861b0979b3af17" Dec 02 08:33:52 crc kubenswrapper[4691]: E1202 08:33:52.487381 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2329b8f4d77f328c7189754fad037e2d09176a8859fa33c871861b0979b3af17\": container with ID starting with 2329b8f4d77f328c7189754fad037e2d09176a8859fa33c871861b0979b3af17 not found: ID does not exist" containerID="2329b8f4d77f328c7189754fad037e2d09176a8859fa33c871861b0979b3af17" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.487403 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2329b8f4d77f328c7189754fad037e2d09176a8859fa33c871861b0979b3af17"} err="failed to get container status \"2329b8f4d77f328c7189754fad037e2d09176a8859fa33c871861b0979b3af17\": rpc error: code = NotFound desc = could not find container \"2329b8f4d77f328c7189754fad037e2d09176a8859fa33c871861b0979b3af17\": container with ID starting with 2329b8f4d77f328c7189754fad037e2d09176a8859fa33c871861b0979b3af17 not found: ID does not exist" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.585521 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931b78b5-2c3d-44f8-bb48-dc0259eb63ec" path="/var/lib/kubelet/pods/931b78b5-2c3d-44f8-bb48-dc0259eb63ec/volumes" Dec 02 08:33:52 crc kubenswrapper[4691]: I1202 08:33:52.771010 
4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-njk8r"] Dec 02 08:33:53 crc kubenswrapper[4691]: I1202 08:33:53.376448 4691 generic.go:334] "Generic (PLEG): container finished" podID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" containerID="73c62c6f8e4f046f6d65d0b8db41596ef31c6f9525ae7ce51b4ca8d6cccf683a" exitCode=0 Dec 02 08:33:53 crc kubenswrapper[4691]: I1202 08:33:53.376565 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njk8r" event={"ID":"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f","Type":"ContainerDied","Data":"73c62c6f8e4f046f6d65d0b8db41596ef31c6f9525ae7ce51b4ca8d6cccf683a"} Dec 02 08:33:53 crc kubenswrapper[4691]: I1202 08:33:53.376597 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njk8r" event={"ID":"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f","Type":"ContainerStarted","Data":"5982d1928878e18559d5f737220f6e38efb4f4e09f301a38db3aa64e4a6307c2"} Dec 02 08:33:54 crc kubenswrapper[4691]: I1202 08:33:54.394025 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njk8r" event={"ID":"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f","Type":"ContainerStarted","Data":"fa734e2f24d6cd4d382d5be6de986e68467c28e6f52ca8537854da30398bcfa8"} Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.405109 4691 generic.go:334] "Generic (PLEG): container finished" podID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" containerID="fa734e2f24d6cd4d382d5be6de986e68467c28e6f52ca8537854da30398bcfa8" exitCode=0 Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.405208 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njk8r" event={"ID":"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f","Type":"ContainerDied","Data":"fa734e2f24d6cd4d382d5be6de986e68467c28e6f52ca8537854da30398bcfa8"} Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.505042 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9tdbf"] Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.507468 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.522372 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9tdbf"] Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.700848 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f94a235-c106-49a9-a61a-d3242c936067-utilities\") pod \"certified-operators-9tdbf\" (UID: \"9f94a235-c106-49a9-a61a-d3242c936067\") " pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.700916 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f94a235-c106-49a9-a61a-d3242c936067-catalog-content\") pod \"certified-operators-9tdbf\" (UID: \"9f94a235-c106-49a9-a61a-d3242c936067\") " pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.701527 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbzct\" (UniqueName: \"kubernetes.io/projected/9f94a235-c106-49a9-a61a-d3242c936067-kube-api-access-zbzct\") pod \"certified-operators-9tdbf\" (UID: \"9f94a235-c106-49a9-a61a-d3242c936067\") " pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.803279 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbzct\" (UniqueName: \"kubernetes.io/projected/9f94a235-c106-49a9-a61a-d3242c936067-kube-api-access-zbzct\") pod \"certified-operators-9tdbf\" (UID: \"9f94a235-c106-49a9-a61a-d3242c936067\") " pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.803388 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f94a235-c106-49a9-a61a-d3242c936067-utilities\") pod \"certified-operators-9tdbf\" (UID: \"9f94a235-c106-49a9-a61a-d3242c936067\") " pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.803417 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f94a235-c106-49a9-a61a-d3242c936067-catalog-content\") pod \"certified-operators-9tdbf\" (UID: \"9f94a235-c106-49a9-a61a-d3242c936067\") " pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.804109 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f94a235-c106-49a9-a61a-d3242c936067-utilities\") pod \"certified-operators-9tdbf\" (UID: \"9f94a235-c106-49a9-a61a-d3242c936067\") " pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.804134 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f94a235-c106-49a9-a61a-d3242c936067-catalog-content\") pod \"certified-operators-9tdbf\" (UID: \"9f94a235-c106-49a9-a61a-d3242c936067\") " pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.825444 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zbzct\" (UniqueName: \"kubernetes.io/projected/9f94a235-c106-49a9-a61a-d3242c936067-kube-api-access-zbzct\") pod \"certified-operators-9tdbf\" (UID: \"9f94a235-c106-49a9-a61a-d3242c936067\") " pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.839437 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:33:55 crc kubenswrapper[4691]: I1202 08:33:55.945112 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bnfmx"] Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.009794 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnfmx"] Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.010243 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.157397 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a826d05-0c5c-4091-bf21-c182af664bd8-catalog-content\") pod \"redhat-marketplace-bnfmx\" (UID: \"8a826d05-0c5c-4091-bf21-c182af664bd8\") " pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.157717 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqcbl\" (UniqueName: \"kubernetes.io/projected/8a826d05-0c5c-4091-bf21-c182af664bd8-kube-api-access-mqcbl\") pod \"redhat-marketplace-bnfmx\" (UID: \"8a826d05-0c5c-4091-bf21-c182af664bd8\") " pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.157985 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a826d05-0c5c-4091-bf21-c182af664bd8-utilities\") pod \"redhat-marketplace-bnfmx\" (UID: \"8a826d05-0c5c-4091-bf21-c182af664bd8\") " pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.261447 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a826d05-0c5c-4091-bf21-c182af664bd8-utilities\") pod \"redhat-marketplace-bnfmx\" (UID: \"8a826d05-0c5c-4091-bf21-c182af664bd8\") " pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.261511 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a826d05-0c5c-4091-bf21-c182af664bd8-catalog-content\") pod \"redhat-marketplace-bnfmx\" (UID: \"8a826d05-0c5c-4091-bf21-c182af664bd8\") " pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.261649 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqcbl\" (UniqueName: \"kubernetes.io/projected/8a826d05-0c5c-4091-bf21-c182af664bd8-kube-api-access-mqcbl\") pod \"redhat-marketplace-bnfmx\" (UID: \"8a826d05-0c5c-4091-bf21-c182af664bd8\") " pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.262662 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a826d05-0c5c-4091-bf21-c182af664bd8-utilities\") pod \"redhat-marketplace-bnfmx\" (UID: \"8a826d05-0c5c-4091-bf21-c182af664bd8\") " pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.262984 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a826d05-0c5c-4091-bf21-c182af664bd8-catalog-content\") pod \"redhat-marketplace-bnfmx\" (UID: \"8a826d05-0c5c-4091-bf21-c182af664bd8\") " pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.296936 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqcbl\" (UniqueName: \"kubernetes.io/projected/8a826d05-0c5c-4091-bf21-c182af664bd8-kube-api-access-mqcbl\") pod \"redhat-marketplace-bnfmx\" (UID: \"8a826d05-0c5c-4091-bf21-c182af664bd8\") " pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.358480 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.500422 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9tdbf"] Dec 02 08:33:56 crc kubenswrapper[4691]: W1202 08:33:56.932094 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a826d05_0c5c_4091_bf21_c182af664bd8.slice/crio-792709769b7676b555b1ae37941ce53fb6519ef7504d94494071778aec1e3032 WatchSource:0}: Error finding container 792709769b7676b555b1ae37941ce53fb6519ef7504d94494071778aec1e3032: Status 404 returned error can't find the container with id 792709769b7676b555b1ae37941ce53fb6519ef7504d94494071778aec1e3032 Dec 02 08:33:56 crc kubenswrapper[4691]: I1202 08:33:56.933987 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnfmx"] Dec 02 08:33:57 crc kubenswrapper[4691]: I1202 08:33:57.452630 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnfmx" event={"ID":"8a826d05-0c5c-4091-bf21-c182af664bd8","Type":"ContainerStarted","Data":"792709769b7676b555b1ae37941ce53fb6519ef7504d94494071778aec1e3032"} Dec 02 08:33:57 crc kubenswrapper[4691]: I1202 08:33:57.461409 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njk8r" event={"ID":"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f","Type":"ContainerStarted","Data":"34d6586f315ef2e2ec660611e51fd9d19b77dcabe9e6c9a4ff9c619e0e08f0d6"} Dec 02 08:33:57 crc kubenswrapper[4691]: I1202 08:33:57.463637 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tdbf" event={"ID":"9f94a235-c106-49a9-a61a-d3242c936067","Type":"ContainerStarted","Data":"7c539437eb6acb07ab954c556ba41b620a6eff6a7b984fa8bb40a16f53f3a3a8"} Dec 02 08:33:57 crc kubenswrapper[4691]: I1202 08:33:57.463680 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tdbf" event={"ID":"9f94a235-c106-49a9-a61a-d3242c936067","Type":"ContainerStarted","Data":"b1abe5b308b7b547df700a37f7fa1363164e0198253d178fff62a8391108a3a0"} Dec 02 08:33:57 crc kubenswrapper[4691]: I1202 08:33:57.491968 4691 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/redhat-operators-njk8r" podStartSLOduration=3.842843938 podStartE2EDuration="6.491952931s" podCreationTimestamp="2025-12-02 08:33:51 +0000 UTC" firstStartedPulling="2025-12-02 08:33:53.381686903 +0000 UTC m=+2881.165765765" lastFinishedPulling="2025-12-02 08:33:56.030795896 +0000 UTC m=+2883.814874758" observedRunningTime="2025-12-02 08:33:57.48952939 +0000 UTC m=+2885.273608282" watchObservedRunningTime="2025-12-02 08:33:57.491952931 +0000 UTC m=+2885.276031793" Dec 02 08:33:58 crc kubenswrapper[4691]: I1202 08:33:58.474311 4691 generic.go:334] "Generic (PLEG): container finished" podID="9f94a235-c106-49a9-a61a-d3242c936067" containerID="7c539437eb6acb07ab954c556ba41b620a6eff6a7b984fa8bb40a16f53f3a3a8" exitCode=0 Dec 02 08:33:58 crc kubenswrapper[4691]: I1202 08:33:58.474377 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tdbf" event={"ID":"9f94a235-c106-49a9-a61a-d3242c936067","Type":"ContainerDied","Data":"7c539437eb6acb07ab954c556ba41b620a6eff6a7b984fa8bb40a16f53f3a3a8"} Dec 02 08:33:58 crc kubenswrapper[4691]: I1202 08:33:58.476792 4691 generic.go:334] "Generic (PLEG): container finished" podID="8a826d05-0c5c-4091-bf21-c182af664bd8" containerID="fa9111c49e70dc79b90884ff1e11e89119ed967184e78c2550bf014357d6114b" exitCode=0 Dec 02 08:33:58 crc kubenswrapper[4691]: I1202 08:33:58.476960 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnfmx" event={"ID":"8a826d05-0c5c-4091-bf21-c182af664bd8","Type":"ContainerDied","Data":"fa9111c49e70dc79b90884ff1e11e89119ed967184e78c2550bf014357d6114b"} Dec 02 08:33:59 crc kubenswrapper[4691]: I1202 08:33:59.493937 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnfmx" event={"ID":"8a826d05-0c5c-4091-bf21-c182af664bd8","Type":"ContainerStarted","Data":"d0add4629ad2ba8016d7dffd3d6cb7a0506be4e7189501d8af34fff259bbf85f"} Dec 02 08:33:59 crc kubenswrapper[4691]: I1202 08:33:59.498581 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tdbf" event={"ID":"9f94a235-c106-49a9-a61a-d3242c936067","Type":"ContainerStarted","Data":"fe97eafabd15ad077e96be36cf24b740c7cf5b8db6a615b2c6af17bd6ff836db"} Dec 02 08:34:00 crc kubenswrapper[4691]: I1202 08:34:00.509374 4691 generic.go:334] "Generic (PLEG): container finished" podID="9f94a235-c106-49a9-a61a-d3242c936067" containerID="fe97eafabd15ad077e96be36cf24b740c7cf5b8db6a615b2c6af17bd6ff836db" exitCode=0 Dec 02 08:34:00 crc kubenswrapper[4691]: I1202 08:34:00.509438 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tdbf" event={"ID":"9f94a235-c106-49a9-a61a-d3242c936067","Type":"ContainerDied","Data":"fe97eafabd15ad077e96be36cf24b740c7cf5b8db6a615b2c6af17bd6ff836db"} Dec 02 08:34:00 crc kubenswrapper[4691]: I1202 08:34:00.511437 4691 generic.go:334] "Generic (PLEG): container finished" podID="8a826d05-0c5c-4091-bf21-c182af664bd8" containerID="d0add4629ad2ba8016d7dffd3d6cb7a0506be4e7189501d8af34fff259bbf85f" exitCode=0 Dec 02 08:34:00 crc kubenswrapper[4691]: I1202 08:34:00.511467 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnfmx" event={"ID":"8a826d05-0c5c-4091-bf21-c182af664bd8","Type":"ContainerDied","Data":"d0add4629ad2ba8016d7dffd3d6cb7a0506be4e7189501d8af34fff259bbf85f"} Dec 02 08:34:02 crc kubenswrapper[4691]: I1202 08:34:02.237174 4691 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:34:02 crc kubenswrapper[4691]: I1202 08:34:02.237823 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:34:02 crc kubenswrapper[4691]: I1202 08:34:02.534987 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tdbf" event={"ID":"9f94a235-c106-49a9-a61a-d3242c936067","Type":"ContainerStarted","Data":"112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c"} Dec 02 08:34:02 crc kubenswrapper[4691]: I1202 08:34:02.538991 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnfmx" event={"ID":"8a826d05-0c5c-4091-bf21-c182af664bd8","Type":"ContainerStarted","Data":"aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c"} Dec 02 08:34:02 crc kubenswrapper[4691]: I1202 08:34:02.586879 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bnfmx" podStartSLOduration=4.664009089 podStartE2EDuration="7.586863442s" podCreationTimestamp="2025-12-02 08:33:55 +0000 UTC" firstStartedPulling="2025-12-02 08:33:58.479232391 +0000 UTC m=+2886.263311253" lastFinishedPulling="2025-12-02 08:34:01.402086744 +0000 UTC m=+2889.186165606" observedRunningTime="2025-12-02 08:34:02.5835979 +0000 UTC m=+2890.367676762" watchObservedRunningTime="2025-12-02 08:34:02.586863442 +0000 UTC m=+2890.370942294" Dec 02 08:34:02 crc kubenswrapper[4691]: I1202 08:34:02.588485 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9tdbf" podStartSLOduration=3.157881915 podStartE2EDuration="7.588478823s" podCreationTimestamp="2025-12-02 08:33:55 +0000 UTC" firstStartedPulling="2025-12-02 08:33:57.465208849 +0000 UTC m=+2885.249287711" lastFinishedPulling="2025-12-02 08:34:01.895805767 +0000 UTC m=+2889.679884619" observedRunningTime="2025-12-02 08:34:02.555487605 +0000 UTC m=+2890.339566477" watchObservedRunningTime="2025-12-02 08:34:02.588478823 +0000 UTC m=+2890.372557685" Dec 02 08:34:03 crc kubenswrapper[4691]: I1202 08:34:03.299775 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-njk8r" podUID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" containerName="registry-server" probeResult="failure" output=< Dec 02 08:34:03 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Dec 02 08:34:03 crc kubenswrapper[4691]: > Dec 02 08:34:05 crc kubenswrapper[4691]: I1202 08:34:05.840240 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:34:05 crc kubenswrapper[4691]: I1202 08:34:05.840628 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:34:05 crc kubenswrapper[4691]: I1202 08:34:05.895615 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:34:06 crc kubenswrapper[4691]: I1202 08:34:06.359860 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:34:06 crc kubenswrapper[4691]: I1202 08:34:06.360246 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:34:06 crc kubenswrapper[4691]: I1202 08:34:06.412403 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:34:06 crc kubenswrapper[4691]: I1202 08:34:06.633011 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:34:07 crc kubenswrapper[4691]: I1202 08:34:07.887795 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnfmx"] Dec 02 08:34:08 crc kubenswrapper[4691]: I1202 08:34:08.589477 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bnfmx" podUID="8a826d05-0c5c-4091-bf21-c182af664bd8" containerName="registry-server" containerID="cri-o://aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c" gracePeriod=2 Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.138428 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.285694 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a826d05-0c5c-4091-bf21-c182af664bd8-catalog-content\") pod \"8a826d05-0c5c-4091-bf21-c182af664bd8\" (UID: \"8a826d05-0c5c-4091-bf21-c182af664bd8\") " Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.285924 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqcbl\" (UniqueName: \"kubernetes.io/projected/8a826d05-0c5c-4091-bf21-c182af664bd8-kube-api-access-mqcbl\") pod \"8a826d05-0c5c-4091-bf21-c182af664bd8\" (UID: \"8a826d05-0c5c-4091-bf21-c182af664bd8\") " Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.285979 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a826d05-0c5c-4091-bf21-c182af664bd8-utilities\") pod \"8a826d05-0c5c-4091-bf21-c182af664bd8\" (UID: \"8a826d05-0c5c-4091-bf21-c182af664bd8\") " Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.286774 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a826d05-0c5c-4091-bf21-c182af664bd8-utilities" (OuterVolumeSpecName: "utilities") pod "8a826d05-0c5c-4091-bf21-c182af664bd8" (UID: "8a826d05-0c5c-4091-bf21-c182af664bd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.291791 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a826d05-0c5c-4091-bf21-c182af664bd8-kube-api-access-mqcbl" (OuterVolumeSpecName: "kube-api-access-mqcbl") pod "8a826d05-0c5c-4091-bf21-c182af664bd8" (UID: "8a826d05-0c5c-4091-bf21-c182af664bd8"). InnerVolumeSpecName "kube-api-access-mqcbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.305336 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a826d05-0c5c-4091-bf21-c182af664bd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a826d05-0c5c-4091-bf21-c182af664bd8" (UID: "8a826d05-0c5c-4091-bf21-c182af664bd8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.388901 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a826d05-0c5c-4091-bf21-c182af664bd8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.388957 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqcbl\" (UniqueName: \"kubernetes.io/projected/8a826d05-0c5c-4091-bf21-c182af664bd8-kube-api-access-mqcbl\") on node \"crc\" DevicePath \"\"" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.388969 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a826d05-0c5c-4091-bf21-c182af664bd8-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.599673 4691 generic.go:334] "Generic (PLEG): container finished" podID="8a826d05-0c5c-4091-bf21-c182af664bd8" containerID="aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c" exitCode=0 Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.599722 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnfmx" event={"ID":"8a826d05-0c5c-4091-bf21-c182af664bd8","Type":"ContainerDied","Data":"aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c"} Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.599739 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bnfmx" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.599779 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bnfmx" event={"ID":"8a826d05-0c5c-4091-bf21-c182af664bd8","Type":"ContainerDied","Data":"792709769b7676b555b1ae37941ce53fb6519ef7504d94494071778aec1e3032"} Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.599807 4691 scope.go:117] "RemoveContainer" containerID="aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.620335 4691 scope.go:117] "RemoveContainer" containerID="d0add4629ad2ba8016d7dffd3d6cb7a0506be4e7189501d8af34fff259bbf85f" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.640687 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnfmx"] Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.650484 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bnfmx"] Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.665080 4691 scope.go:117] "RemoveContainer" containerID="fa9111c49e70dc79b90884ff1e11e89119ed967184e78c2550bf014357d6114b" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.705286 4691 scope.go:117] "RemoveContainer" containerID="aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c" Dec 02 08:34:09 crc kubenswrapper[4691]: E1202 08:34:09.705769 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c\": container with ID starting with aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c not found: ID does not exist" containerID="aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.705831 4691 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c"} err="failed to get container status \"aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c\": rpc error: code = NotFound desc = could not find container \"aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c\": container with ID starting with aad37125d824fee9f8e81e81b9c11f6cbeafd68631d659fc104c0643f4e75a3c not found: ID does not exist" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.705854 4691 scope.go:117] "RemoveContainer" containerID="d0add4629ad2ba8016d7dffd3d6cb7a0506be4e7189501d8af34fff259bbf85f" Dec 02 08:34:09 crc kubenswrapper[4691]: E1202 08:34:09.706397 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0add4629ad2ba8016d7dffd3d6cb7a0506be4e7189501d8af34fff259bbf85f\": container with ID starting with d0add4629ad2ba8016d7dffd3d6cb7a0506be4e7189501d8af34fff259bbf85f not found: ID does not exist" containerID="d0add4629ad2ba8016d7dffd3d6cb7a0506be4e7189501d8af34fff259bbf85f" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.706416 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0add4629ad2ba8016d7dffd3d6cb7a0506be4e7189501d8af34fff259bbf85f"} err="failed to get container status \"d0add4629ad2ba8016d7dffd3d6cb7a0506be4e7189501d8af34fff259bbf85f\": rpc error: code = NotFound desc = could not find container \"d0add4629ad2ba8016d7dffd3d6cb7a0506be4e7189501d8af34fff259bbf85f\": container with ID starting with d0add4629ad2ba8016d7dffd3d6cb7a0506be4e7189501d8af34fff259bbf85f not found: ID does not exist" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.706501 4691 scope.go:117] "RemoveContainer" containerID="fa9111c49e70dc79b90884ff1e11e89119ed967184e78c2550bf014357d6114b" Dec 02 08:34:09 crc kubenswrapper[4691]: E1202 08:34:09.706915 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa9111c49e70dc79b90884ff1e11e89119ed967184e78c2550bf014357d6114b\": container with ID starting with fa9111c49e70dc79b90884ff1e11e89119ed967184e78c2550bf014357d6114b not found: ID does not exist" containerID="fa9111c49e70dc79b90884ff1e11e89119ed967184e78c2550bf014357d6114b" Dec 02 08:34:09 crc kubenswrapper[4691]: I1202 08:34:09.706945 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa9111c49e70dc79b90884ff1e11e89119ed967184e78c2550bf014357d6114b"} err="failed to get container status \"fa9111c49e70dc79b90884ff1e11e89119ed967184e78c2550bf014357d6114b\": rpc error: code = NotFound desc = could not find container \"fa9111c49e70dc79b90884ff1e11e89119ed967184e78c2550bf014357d6114b\": container with ID starting with fa9111c49e70dc79b90884ff1e11e89119ed967184e78c2550bf014357d6114b not found: ID does not exist" Dec 02 08:34:10 crc kubenswrapper[4691]: I1202 08:34:10.574754 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a826d05-0c5c-4091-bf21-c182af664bd8" path="/var/lib/kubelet/pods/8a826d05-0c5c-4091-bf21-c182af664bd8/volumes" Dec 02 08:34:12 crc kubenswrapper[4691]: I1202 08:34:12.282429 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:34:12 crc kubenswrapper[4691]: I1202 08:34:12.329829 4691 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:34:15 crc kubenswrapper[4691]: I1202 08:34:15.951266 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.288044 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-njk8r"] Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.288309 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-njk8r" podUID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" containerName="registry-server" containerID="cri-o://34d6586f315ef2e2ec660611e51fd9d19b77dcabe9e6c9a4ff9c619e0e08f0d6" gracePeriod=2 Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.667127 4691 generic.go:334] "Generic (PLEG): container finished" podID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" containerID="34d6586f315ef2e2ec660611e51fd9d19b77dcabe9e6c9a4ff9c619e0e08f0d6" exitCode=0 Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.667207 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njk8r" event={"ID":"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f","Type":"ContainerDied","Data":"34d6586f315ef2e2ec660611e51fd9d19b77dcabe9e6c9a4ff9c619e0e08f0d6"} Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.763060 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.875902 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-utilities\") pod \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\" (UID: \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\") " Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.876111 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9hz4\" (UniqueName: \"kubernetes.io/projected/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-kube-api-access-f9hz4\") pod \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\" (UID: \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\") " Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.876132 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-catalog-content\") pod \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\" (UID: \"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f\") " Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.877087 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-utilities" (OuterVolumeSpecName: "utilities") pod "0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" (UID: "0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.882645 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-kube-api-access-f9hz4" (OuterVolumeSpecName: "kube-api-access-f9hz4") pod "0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" (UID: "0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f"). InnerVolumeSpecName "kube-api-access-f9hz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.978213 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9hz4\" (UniqueName: \"kubernetes.io/projected/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-kube-api-access-f9hz4\") on node \"crc\" DevicePath \"\"" Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.978258 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:34:16 crc kubenswrapper[4691]: I1202 08:34:16.990546 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" (UID: "0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:34:17 crc kubenswrapper[4691]: I1202 08:34:17.079862 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:34:17 crc kubenswrapper[4691]: I1202 08:34:17.679590 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-njk8r" event={"ID":"0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f","Type":"ContainerDied","Data":"5982d1928878e18559d5f737220f6e38efb4f4e09f301a38db3aa64e4a6307c2"} Dec 02 08:34:17 crc kubenswrapper[4691]: I1202 08:34:17.679645 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-njk8r" Dec 02 08:34:17 crc kubenswrapper[4691]: I1202 08:34:17.679663 4691 scope.go:117] "RemoveContainer" containerID="34d6586f315ef2e2ec660611e51fd9d19b77dcabe9e6c9a4ff9c619e0e08f0d6" Dec 02 08:34:17 crc kubenswrapper[4691]: I1202 08:34:17.705271 4691 scope.go:117] "RemoveContainer" containerID="fa734e2f24d6cd4d382d5be6de986e68467c28e6f52ca8537854da30398bcfa8" Dec 02 08:34:17 crc kubenswrapper[4691]: I1202 08:34:17.727186 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-njk8r"] Dec 02 08:34:17 crc kubenswrapper[4691]: I1202 08:34:17.735378 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-njk8r"] Dec 02 08:34:17 crc kubenswrapper[4691]: I1202 08:34:17.754936 4691 scope.go:117] "RemoveContainer" containerID="73c62c6f8e4f046f6d65d0b8db41596ef31c6f9525ae7ce51b4ca8d6cccf683a" Dec 02 08:34:18 crc kubenswrapper[4691]: I1202 08:34:18.573007 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" path="/var/lib/kubelet/pods/0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f/volumes" Dec 02 08:34:18 crc kubenswrapper[4691]: I1202 08:34:18.689189 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9tdbf"] Dec 02 08:34:18 crc kubenswrapper[4691]: I1202 08:34:18.689413 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9tdbf" podUID="9f94a235-c106-49a9-a61a-d3242c936067" containerName="registry-server" containerID="cri-o://112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c" gracePeriod=2 Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 
08:34:19.183579 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.335075 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbzct\" (UniqueName: \"kubernetes.io/projected/9f94a235-c106-49a9-a61a-d3242c936067-kube-api-access-zbzct\") pod \"9f94a235-c106-49a9-a61a-d3242c936067\" (UID: \"9f94a235-c106-49a9-a61a-d3242c936067\") " Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.335289 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f94a235-c106-49a9-a61a-d3242c936067-utilities\") pod \"9f94a235-c106-49a9-a61a-d3242c936067\" (UID: \"9f94a235-c106-49a9-a61a-d3242c936067\") " Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.335535 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f94a235-c106-49a9-a61a-d3242c936067-catalog-content\") pod \"9f94a235-c106-49a9-a61a-d3242c936067\" (UID: \"9f94a235-c106-49a9-a61a-d3242c936067\") " Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.339714 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f94a235-c106-49a9-a61a-d3242c936067-utilities" (OuterVolumeSpecName: "utilities") pod "9f94a235-c106-49a9-a61a-d3242c936067" (UID: "9f94a235-c106-49a9-a61a-d3242c936067"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.349076 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f94a235-c106-49a9-a61a-d3242c936067-kube-api-access-zbzct" (OuterVolumeSpecName: "kube-api-access-zbzct") pod "9f94a235-c106-49a9-a61a-d3242c936067" (UID: "9f94a235-c106-49a9-a61a-d3242c936067"). InnerVolumeSpecName "kube-api-access-zbzct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.394192 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f94a235-c106-49a9-a61a-d3242c936067-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f94a235-c106-49a9-a61a-d3242c936067" (UID: "9f94a235-c106-49a9-a61a-d3242c936067"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.438411 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f94a235-c106-49a9-a61a-d3242c936067-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.438465 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbzct\" (UniqueName: \"kubernetes.io/projected/9f94a235-c106-49a9-a61a-d3242c936067-kube-api-access-zbzct\") on node \"crc\" DevicePath \"\"" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.438481 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f94a235-c106-49a9-a61a-d3242c936067-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.702621 4691 generic.go:334] "Generic (PLEG): container finished" podID="9f94a235-c106-49a9-a61a-d3242c936067" containerID="112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c" exitCode=0 Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.702676 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tdbf" event={"ID":"9f94a235-c106-49a9-a61a-d3242c936067","Type":"ContainerDied","Data":"112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c"} Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.702713 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9tdbf" event={"ID":"9f94a235-c106-49a9-a61a-d3242c936067","Type":"ContainerDied","Data":"b1abe5b308b7b547df700a37f7fa1363164e0198253d178fff62a8391108a3a0"} Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.702747 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9tdbf" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.702748 4691 scope.go:117] "RemoveContainer" containerID="112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.733097 4691 scope.go:117] "RemoveContainer" containerID="fe97eafabd15ad077e96be36cf24b740c7cf5b8db6a615b2c6af17bd6ff836db" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.743155 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9tdbf"] Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.754889 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9tdbf"] Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.762841 4691 scope.go:117] "RemoveContainer" containerID="7c539437eb6acb07ab954c556ba41b620a6eff6a7b984fa8bb40a16f53f3a3a8" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.816796 4691 scope.go:117] "RemoveContainer" containerID="112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c" Dec 02 08:34:19 crc kubenswrapper[4691]: E1202 08:34:19.817839 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c\": container with ID starting with 112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c not found: ID does not exist" containerID="112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.817887 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c"} err="failed to get container status \"112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c\": rpc error: code = NotFound desc = could not find container \"112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c\": container with ID starting with 112e99fdca1331e0b8431d0008f99e4125435cc70e3702eceb27fa841efa4f3c not found: ID does not exist" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.817917 4691 scope.go:117] "RemoveContainer" containerID="fe97eafabd15ad077e96be36cf24b740c7cf5b8db6a615b2c6af17bd6ff836db" Dec 02 08:34:19 crc kubenswrapper[4691]: E1202 08:34:19.818364 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe97eafabd15ad077e96be36cf24b740c7cf5b8db6a615b2c6af17bd6ff836db\": container with ID starting with fe97eafabd15ad077e96be36cf24b740c7cf5b8db6a615b2c6af17bd6ff836db not found: ID does not exist" containerID="fe97eafabd15ad077e96be36cf24b740c7cf5b8db6a615b2c6af17bd6ff836db" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.818438 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe97eafabd15ad077e96be36cf24b740c7cf5b8db6a615b2c6af17bd6ff836db"} err="failed to get container status \"fe97eafabd15ad077e96be36cf24b740c7cf5b8db6a615b2c6af17bd6ff836db\": rpc error: code = NotFound desc = could not find container \"fe97eafabd15ad077e96be36cf24b740c7cf5b8db6a615b2c6af17bd6ff836db\": container with ID starting with fe97eafabd15ad077e96be36cf24b740c7cf5b8db6a615b2c6af17bd6ff836db not found: ID does not exist" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.818453 4691 scope.go:117] "RemoveContainer" 
containerID="7c539437eb6acb07ab954c556ba41b620a6eff6a7b984fa8bb40a16f53f3a3a8" Dec 02 08:34:19 crc kubenswrapper[4691]: E1202 08:34:19.818955 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c539437eb6acb07ab954c556ba41b620a6eff6a7b984fa8bb40a16f53f3a3a8\": container with ID starting with 7c539437eb6acb07ab954c556ba41b620a6eff6a7b984fa8bb40a16f53f3a3a8 not found: ID does not exist" containerID="7c539437eb6acb07ab954c556ba41b620a6eff6a7b984fa8bb40a16f53f3a3a8" Dec 02 08:34:19 crc kubenswrapper[4691]: I1202 08:34:19.819033 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c539437eb6acb07ab954c556ba41b620a6eff6a7b984fa8bb40a16f53f3a3a8"} err="failed to get container status \"7c539437eb6acb07ab954c556ba41b620a6eff6a7b984fa8bb40a16f53f3a3a8\": rpc error: code = NotFound desc = could not find container \"7c539437eb6acb07ab954c556ba41b620a6eff6a7b984fa8bb40a16f53f3a3a8\": container with ID starting with 7c539437eb6acb07ab954c556ba41b620a6eff6a7b984fa8bb40a16f53f3a3a8 not found: ID does not exist" Dec 02 08:34:20 crc kubenswrapper[4691]: I1202 08:34:20.574737 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f94a235-c106-49a9-a61a-d3242c936067" path="/var/lib/kubelet/pods/9f94a235-c106-49a9-a61a-d3242c936067/volumes" Dec 02 08:34:21 crc kubenswrapper[4691]: I1202 08:34:21.901335 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:34:21 crc kubenswrapper[4691]: I1202 08:34:21.901395 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:34:21 crc kubenswrapper[4691]: I1202 08:34:21.901456 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 08:34:21 crc kubenswrapper[4691]: I1202 08:34:21.905863 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"387a9f01752980f7a895280ce4d44745ff4d7bd96a061e741af6408308862c87"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:34:21 crc kubenswrapper[4691]: I1202 08:34:21.905956 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://387a9f01752980f7a895280ce4d44745ff4d7bd96a061e741af6408308862c87" gracePeriod=600 Dec 02 08:34:22 crc kubenswrapper[4691]: I1202 08:34:22.931177 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="387a9f01752980f7a895280ce4d44745ff4d7bd96a061e741af6408308862c87" exitCode=0 Dec 02 08:34:22 crc kubenswrapper[4691]: I1202 08:34:22.931237 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"387a9f01752980f7a895280ce4d44745ff4d7bd96a061e741af6408308862c87"} Dec 02 08:34:22 crc kubenswrapper[4691]: I1202 08:34:22.931808 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9"} Dec 02 08:34:22 crc kubenswrapper[4691]: I1202 08:34:22.931854 4691 scope.go:117] "RemoveContainer" containerID="6ae5611aab101089d8b8a2f19ececc62affe8f98b0e79e5be5e89d8e3c8ffa8d" Dec 02 08:35:49 crc kubenswrapper[4691]: I1202 08:35:49.879021 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="hostpath-provisioner/csi-hostpathplugin-2pv42" podUID="1eb687f4-51f6-4806-b5a0-e35639b4b019" containerName="hostpath-provisioner" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 02 08:36:51 crc kubenswrapper[4691]: I1202 08:36:51.898943 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:36:51 crc kubenswrapper[4691]: I1202 08:36:51.900480 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:37:21 crc kubenswrapper[4691]: I1202 08:37:21.898974 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:37:21 crc kubenswrapper[4691]: I1202 08:37:21.899917 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:37:51 crc kubenswrapper[4691]: I1202 08:37:51.899572 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:37:51 crc kubenswrapper[4691]: I1202 08:37:51.900343 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:37:51 crc kubenswrapper[4691]: I1202 08:37:51.900415 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 08:37:51 crc kubenswrapper[4691]: I1202 
Dec 02 08:37:51 crc kubenswrapper[4691]: I1202 08:37:51.901403 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:37:51 crc kubenswrapper[4691]: I1202 08:37:51.901464 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" gracePeriod=600 Dec 02 08:37:52 crc kubenswrapper[4691]: E1202 08:37:52.025077 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:37:52 crc kubenswrapper[4691]: I1202 08:37:52.264711 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" exitCode=0 Dec 02 08:37:52 crc kubenswrapper[4691]: I1202 08:37:52.264734 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9"} Dec 02 08:37:52 crc kubenswrapper[4691]: I1202 08:37:52.264879 4691 scope.go:117] "RemoveContainer" containerID="387a9f01752980f7a895280ce4d44745ff4d7bd96a061e741af6408308862c87" Dec 02 08:37:52 crc kubenswrapper[4691]: I1202 08:37:52.265536 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:37:52 crc kubenswrapper[4691]: E1202 08:37:52.265918 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:38:03 crc kubenswrapper[4691]: I1202 08:38:03.561261 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:38:03 crc kubenswrapper[4691]: E1202 08:38:03.562186 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:38:17 crc kubenswrapper[4691]: I1202 08:38:17.561795 4691 scope.go:117] "RemoveContainer"
containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:38:17 crc kubenswrapper[4691]: E1202 08:38:17.562790 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:38:28 crc kubenswrapper[4691]: I1202 08:38:28.561912 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:38:28 crc kubenswrapper[4691]: E1202 08:38:28.562960 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:38:43 crc kubenswrapper[4691]: I1202 08:38:43.562212 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:38:43 crc kubenswrapper[4691]: E1202 08:38:43.563177 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:38:56 crc kubenswrapper[4691]: I1202 08:38:56.562900 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:38:56 crc kubenswrapper[4691]: E1202 08:38:56.564074 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:39:11 crc kubenswrapper[4691]: I1202 08:39:11.562131 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:39:11 crc kubenswrapper[4691]: E1202 08:39:11.564211 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:39:24 crc kubenswrapper[4691]: I1202 08:39:24.562394 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:39:24 crc kubenswrapper[4691]: E1202 08:39:24.563340 4691 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:39:37 crc kubenswrapper[4691]: I1202 08:39:37.562830 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:39:37 crc kubenswrapper[4691]: E1202 08:39:37.563995 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:39:52 crc kubenswrapper[4691]: I1202 08:39:52.576084 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:39:52 crc kubenswrapper[4691]: E1202 08:39:52.577079 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:40:05 crc kubenswrapper[4691]: I1202 08:40:05.561950 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:40:05 crc kubenswrapper[4691]: E1202 08:40:05.562840 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:40:20 crc kubenswrapper[4691]: I1202 08:40:20.562544 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:40:20 crc kubenswrapper[4691]: E1202 08:40:20.563499 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:40:31 crc kubenswrapper[4691]: I1202 08:40:31.561823 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:40:31 crc kubenswrapper[4691]: E1202 08:40:31.562543 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:40:45 crc kubenswrapper[4691]: I1202 08:40:45.562119 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:40:45 crc kubenswrapper[4691]: E1202 08:40:45.562822 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:40:56 crc kubenswrapper[4691]: I1202 08:40:56.561480 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:40:56 crc kubenswrapper[4691]: E1202 08:40:56.562360 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:41:07 crc kubenswrapper[4691]: I1202 08:41:07.564620 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:41:07 crc kubenswrapper[4691]: E1202 08:41:07.566041 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:41:19 crc kubenswrapper[4691]: I1202 08:41:19.561969 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:41:19 crc kubenswrapper[4691]: E1202 08:41:19.562694 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:41:33 crc kubenswrapper[4691]: I1202 08:41:33.561921 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:41:33 crc kubenswrapper[4691]: E1202 08:41:33.562855 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" 
podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:41:44 crc kubenswrapper[4691]: I1202 08:41:44.562164 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:41:44 crc kubenswrapper[4691]: E1202 08:41:44.563213 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:41:56 crc kubenswrapper[4691]: I1202 08:41:56.562527 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:41:56 crc kubenswrapper[4691]: E1202 08:41:56.563804 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:42:10 crc kubenswrapper[4691]: I1202 08:42:10.561460 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:42:10 crc kubenswrapper[4691]: E1202 08:42:10.563161 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:42:22 crc kubenswrapper[4691]: I1202 08:42:22.570271 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:42:22 crc kubenswrapper[4691]: E1202 08:42:22.572438 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:42:35 crc kubenswrapper[4691]: I1202 08:42:35.561148 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:42:35 crc kubenswrapper[4691]: E1202 08:42:35.562125 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:42:46 crc kubenswrapper[4691]: I1202 08:42:46.561318 4691 scope.go:117] "RemoveContainer" 
containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:42:46 crc kubenswrapper[4691]: E1202 08:42:46.562271 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:43:01 crc kubenswrapper[4691]: I1202 08:43:01.561247 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:43:01 crc kubenswrapper[4691]: I1202 08:43:01.969617 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"6047dc0d1f247765ff138c50dd8d084fbe9f7fca33383efa09c95136136de241"} Dec 02 08:43:17 crc kubenswrapper[4691]: I1202 08:43:17.165630 4691 generic.go:334] "Generic (PLEG): container finished" podID="0d635e45-a63d-4661-9b82-b21d8ce59623" containerID="ec797487da4d91d41b1724601ab823731bfa893326761b6e060b120a234eb9f1" exitCode=0 Dec 02 08:43:17 crc kubenswrapper[4691]: I1202 08:43:17.165695 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0d635e45-a63d-4661-9b82-b21d8ce59623","Type":"ContainerDied","Data":"ec797487da4d91d41b1724601ab823731bfa893326761b6e060b120a234eb9f1"} Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.576404 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.681686 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpvfm\" (UniqueName: \"kubernetes.io/projected/0d635e45-a63d-4661-9b82-b21d8ce59623-kube-api-access-hpvfm\") pod \"0d635e45-a63d-4661-9b82-b21d8ce59623\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.681822 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-ssh-key\") pod \"0d635e45-a63d-4661-9b82-b21d8ce59623\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.681912 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"0d635e45-a63d-4661-9b82-b21d8ce59623\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.682030 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d635e45-a63d-4661-9b82-b21d8ce59623-config-data\") pod \"0d635e45-a63d-4661-9b82-b21d8ce59623\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.682080 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-openstack-config-secret\") pod \"0d635e45-a63d-4661-9b82-b21d8ce59623\" (UID: 
\"0d635e45-a63d-4661-9b82-b21d8ce59623\") " Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.682123 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d635e45-a63d-4661-9b82-b21d8ce59623-openstack-config\") pod \"0d635e45-a63d-4661-9b82-b21d8ce59623\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.682205 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0d635e45-a63d-4661-9b82-b21d8ce59623-test-operator-ephemeral-temporary\") pod \"0d635e45-a63d-4661-9b82-b21d8ce59623\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.682308 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0d635e45-a63d-4661-9b82-b21d8ce59623-test-operator-ephemeral-workdir\") pod \"0d635e45-a63d-4661-9b82-b21d8ce59623\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.682398 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-ca-certs\") pod \"0d635e45-a63d-4661-9b82-b21d8ce59623\" (UID: \"0d635e45-a63d-4661-9b82-b21d8ce59623\") " Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.684119 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d635e45-a63d-4661-9b82-b21d8ce59623-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "0d635e45-a63d-4661-9b82-b21d8ce59623" (UID: "0d635e45-a63d-4661-9b82-b21d8ce59623"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.684242 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d635e45-a63d-4661-9b82-b21d8ce59623-config-data" (OuterVolumeSpecName: "config-data") pod "0d635e45-a63d-4661-9b82-b21d8ce59623" (UID: "0d635e45-a63d-4661-9b82-b21d8ce59623"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.689423 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "0d635e45-a63d-4661-9b82-b21d8ce59623" (UID: "0d635e45-a63d-4661-9b82-b21d8ce59623"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.689477 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d635e45-a63d-4661-9b82-b21d8ce59623-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "0d635e45-a63d-4661-9b82-b21d8ce59623" (UID: "0d635e45-a63d-4661-9b82-b21d8ce59623"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.689511 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d635e45-a63d-4661-9b82-b21d8ce59623-kube-api-access-hpvfm" (OuterVolumeSpecName: "kube-api-access-hpvfm") pod "0d635e45-a63d-4661-9b82-b21d8ce59623" (UID: "0d635e45-a63d-4661-9b82-b21d8ce59623"). InnerVolumeSpecName "kube-api-access-hpvfm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.719448 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0d635e45-a63d-4661-9b82-b21d8ce59623" (UID: "0d635e45-a63d-4661-9b82-b21d8ce59623"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.721992 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "0d635e45-a63d-4661-9b82-b21d8ce59623" (UID: "0d635e45-a63d-4661-9b82-b21d8ce59623"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.723182 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "0d635e45-a63d-4661-9b82-b21d8ce59623" (UID: "0d635e45-a63d-4661-9b82-b21d8ce59623"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.736194 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d635e45-a63d-4661-9b82-b21d8ce59623-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "0d635e45-a63d-4661-9b82-b21d8ce59623" (UID: "0d635e45-a63d-4661-9b82-b21d8ce59623"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.786088 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpvfm\" (UniqueName: \"kubernetes.io/projected/0d635e45-a63d-4661-9b82-b21d8ce59623-kube-api-access-hpvfm\") on node \"crc\" DevicePath \"\"" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.786334 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.786464 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.786555 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0d635e45-a63d-4661-9b82-b21d8ce59623-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.786647 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.786730 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0d635e45-a63d-4661-9b82-b21d8ce59623-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.786846 4691 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/0d635e45-a63d-4661-9b82-b21d8ce59623-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.786934 4691 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/0d635e45-a63d-4661-9b82-b21d8ce59623-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.787017 4691 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/0d635e45-a63d-4661-9b82-b21d8ce59623-ca-certs\") on node \"crc\" DevicePath \"\"" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.810311 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Dec 02 08:43:18 crc kubenswrapper[4691]: I1202 08:43:18.889442 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Dec 02 08:43:19 crc kubenswrapper[4691]: I1202 08:43:19.187117 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"0d635e45-a63d-4661-9b82-b21d8ce59623","Type":"ContainerDied","Data":"eabd4661d30fb362410fd012ac9af59f51b290d43b073f41ec5130b3f1467e8d"} Dec 02 08:43:19 crc kubenswrapper[4691]: I1202 08:43:19.187170 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eabd4661d30fb362410fd012ac9af59f51b290d43b073f41ec5130b3f1467e8d" Dec 02 08:43:19 crc 
kubenswrapper[4691]: I1202 08:43:19.187228 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.234858 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 08:43:22 crc kubenswrapper[4691]: E1202 08:43:22.237149 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f94a235-c106-49a9-a61a-d3242c936067" containerName="extract-content" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.237266 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f94a235-c106-49a9-a61a-d3242c936067" containerName="extract-content" Dec 02 08:43:22 crc kubenswrapper[4691]: E1202 08:43:22.237351 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f94a235-c106-49a9-a61a-d3242c936067" containerName="registry-server" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.237411 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f94a235-c106-49a9-a61a-d3242c936067" containerName="registry-server" Dec 02 08:43:22 crc kubenswrapper[4691]: E1202 08:43:22.237489 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a826d05-0c5c-4091-bf21-c182af664bd8" containerName="extract-utilities" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.237566 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a826d05-0c5c-4091-bf21-c182af664bd8" containerName="extract-utilities" Dec 02 08:43:22 crc kubenswrapper[4691]: E1202 08:43:22.238705 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" containerName="extract-utilities" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.238837 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" containerName="extract-utilities" Dec 02 08:43:22 crc kubenswrapper[4691]: E1202 08:43:22.238952 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d635e45-a63d-4661-9b82-b21d8ce59623" containerName="tempest-tests-tempest-tests-runner" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.239018 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d635e45-a63d-4661-9b82-b21d8ce59623" containerName="tempest-tests-tempest-tests-runner" Dec 02 08:43:22 crc kubenswrapper[4691]: E1202 08:43:22.239084 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f94a235-c106-49a9-a61a-d3242c936067" containerName="extract-utilities" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.239156 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f94a235-c106-49a9-a61a-d3242c936067" containerName="extract-utilities" Dec 02 08:43:22 crc kubenswrapper[4691]: E1202 08:43:22.239225 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a826d05-0c5c-4091-bf21-c182af664bd8" containerName="extract-content" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.239283 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a826d05-0c5c-4091-bf21-c182af664bd8" containerName="extract-content" Dec 02 08:43:22 crc kubenswrapper[4691]: E1202 08:43:22.239371 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" containerName="registry-server" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.239434 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" 
containerName="registry-server" Dec 02 08:43:22 crc kubenswrapper[4691]: E1202 08:43:22.239497 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a826d05-0c5c-4091-bf21-c182af664bd8" containerName="registry-server" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.239552 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a826d05-0c5c-4091-bf21-c182af664bd8" containerName="registry-server" Dec 02 08:43:22 crc kubenswrapper[4691]: E1202 08:43:22.239619 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" containerName="extract-content" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.239672 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" containerName="extract-content" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.239992 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f94a235-c106-49a9-a61a-d3242c936067" containerName="registry-server" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.240076 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a6f4abb-86bf-46a8-a6b3-c4ddfd62937f" containerName="registry-server" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.240146 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d635e45-a63d-4661-9b82-b21d8ce59623" containerName="tempest-tests-tempest-tests-runner" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.240213 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a826d05-0c5c-4091-bf21-c182af664bd8" containerName="registry-server" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.241064 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.246060 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.247627 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-mtcjg" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.354453 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmbwk\" (UniqueName: \"kubernetes.io/projected/0e4e77b4-f638-453e-9408-61dae4d0f68a-kube-api-access-dmbwk\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4e77b4-f638-453e-9408-61dae4d0f68a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.354628 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4e77b4-f638-453e-9408-61dae4d0f68a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.456701 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4e77b4-f638-453e-9408-61dae4d0f68a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 08:43:22 
crc kubenswrapper[4691]: I1202 08:43:22.456830 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmbwk\" (UniqueName: \"kubernetes.io/projected/0e4e77b4-f638-453e-9408-61dae4d0f68a-kube-api-access-dmbwk\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4e77b4-f638-453e-9408-61dae4d0f68a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.457378 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4e77b4-f638-453e-9408-61dae4d0f68a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.483146 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmbwk\" (UniqueName: \"kubernetes.io/projected/0e4e77b4-f638-453e-9408-61dae4d0f68a-kube-api-access-dmbwk\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4e77b4-f638-453e-9408-61dae4d0f68a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.496714 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"0e4e77b4-f638-453e-9408-61dae4d0f68a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 08:43:22 crc kubenswrapper[4691]: I1202 08:43:22.570688 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Dec 02 08:43:23 crc kubenswrapper[4691]: I1202 08:43:23.008669 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Dec 02 08:43:23 crc kubenswrapper[4691]: I1202 08:43:23.008991 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 02 08:43:23 crc kubenswrapper[4691]: I1202 08:43:23.233541 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0e4e77b4-f638-453e-9408-61dae4d0f68a","Type":"ContainerStarted","Data":"f6b1afe69c081be90bbd0358bfd599c04e8e7f60be463dc863b07e34ae733032"} Dec 02 08:43:25 crc kubenswrapper[4691]: I1202 08:43:25.254281 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"0e4e77b4-f638-453e-9408-61dae4d0f68a","Type":"ContainerStarted","Data":"b8286bc00a604517411aeb87d421960318a7dca07215caf96ca5efa739a456df"} Dec 02 08:43:25 crc kubenswrapper[4691]: I1202 08:43:25.274443 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.663106779 podStartE2EDuration="3.274421861s" podCreationTimestamp="2025-12-02 08:43:22 +0000 UTC" firstStartedPulling="2025-12-02 08:43:23.008020443 +0000 UTC m=+3450.792099305" lastFinishedPulling="2025-12-02 08:43:24.619335525 +0000 UTC m=+3452.403414387" observedRunningTime="2025-12-02 08:43:25.267593868 +0000 UTC m=+3453.051672750" watchObservedRunningTime="2025-12-02 08:43:25.274421861 +0000 UTC m=+3453.058500723" Dec 02 08:43:47 crc kubenswrapper[4691]: I1202 08:43:47.503083 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4lkz8/must-gather-jc7xh"]
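
[Editor's note] The pod_startup_latency_tracker entry above reports podStartE2EDuration="3.274421861s" but podStartSLOduration=1.663106779 for the same pod: the SLO figure excludes the time spent pulling images. The logged timestamps verify this exactly; a worked check in plain Go (arithmetic on values copied from the entry, not kubelet code):

```go
package main

import (
	"fmt"
	"time"
)

// SLO = (watchObservedRunningTime - podCreationTimestamp)
//     - (lastFinishedPulling - firstStartedPulling)
func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST" // fractional seconds parse implicitly
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2025-12-02 08:43:22 +0000 UTC")             // podCreationTimestamp
	firstPull := parse("2025-12-02 08:43:23.008020443 +0000 UTC") // firstStartedPulling
	lastPull := parse("2025-12-02 08:43:24.619335525 +0000 UTC")  // lastFinishedPulling
	running := parse("2025-12-02 08:43:25.274421861 +0000 UTC")   // watchObservedRunningTime

	e2e := running.Sub(created)        // 3.274421861s = podStartE2EDuration
	pulling := lastPull.Sub(firstPull) // 1.611315082s spent pulling the image
	fmt.Println(e2e - pulling)         // 1.663106779s = podStartSLOduration
}
```

The same relation holds for the must-gather pod further down (6.591870378s end-to-end minus 3.763801027s of pulling gives its 2.828069351s SLO duration).
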
Dec 02 08:43:47 crc kubenswrapper[4691]: I1202 08:43:47.507472 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/must-gather-jc7xh" Dec 02 08:43:47 crc kubenswrapper[4691]: I1202 08:43:47.516425 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4lkz8"/"kube-root-ca.crt" Dec 02 08:43:47 crc kubenswrapper[4691]: I1202 08:43:47.516630 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-4lkz8"/"openshift-service-ca.crt" Dec 02 08:43:47 crc kubenswrapper[4691]: I1202 08:43:47.517884 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4lkz8/must-gather-jc7xh"] Dec 02 08:43:47 crc kubenswrapper[4691]: I1202 08:43:47.527883 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0f74c8c7-5577-4b38-824e-0ef73b775f64-must-gather-output\") pod \"must-gather-jc7xh\" (UID: \"0f74c8c7-5577-4b38-824e-0ef73b775f64\") " pod="openshift-must-gather-4lkz8/must-gather-jc7xh" Dec 02 08:43:47 crc kubenswrapper[4691]: I1202 08:43:47.527982 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz562\" (UniqueName: \"kubernetes.io/projected/0f74c8c7-5577-4b38-824e-0ef73b775f64-kube-api-access-bz562\") pod \"must-gather-jc7xh\" (UID: \"0f74c8c7-5577-4b38-824e-0ef73b775f64\") " pod="openshift-must-gather-4lkz8/must-gather-jc7xh" Dec 02 08:43:47 crc kubenswrapper[4691]: I1202 08:43:47.629532 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz562\" (UniqueName: \"kubernetes.io/projected/0f74c8c7-5577-4b38-824e-0ef73b775f64-kube-api-access-bz562\") pod \"must-gather-jc7xh\" (UID: \"0f74c8c7-5577-4b38-824e-0ef73b775f64\") " pod="openshift-must-gather-4lkz8/must-gather-jc7xh" Dec 02 08:43:47 crc kubenswrapper[4691]: I1202 08:43:47.630666 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0f74c8c7-5577-4b38-824e-0ef73b775f64-must-gather-output\") pod \"must-gather-jc7xh\" (UID: \"0f74c8c7-5577-4b38-824e-0ef73b775f64\") " pod="openshift-must-gather-4lkz8/must-gather-jc7xh" Dec 02 08:43:47 crc kubenswrapper[4691]: I1202 08:43:47.631660 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0f74c8c7-5577-4b38-824e-0ef73b775f64-must-gather-output\") pod \"must-gather-jc7xh\" (UID: \"0f74c8c7-5577-4b38-824e-0ef73b775f64\") " pod="openshift-must-gather-4lkz8/must-gather-jc7xh" Dec 02 08:43:47 crc kubenswrapper[4691]: I1202 08:43:47.649716 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz562\" (UniqueName: \"kubernetes.io/projected/0f74c8c7-5577-4b38-824e-0ef73b775f64-kube-api-access-bz562\") pod \"must-gather-jc7xh\" (UID: \"0f74c8c7-5577-4b38-824e-0ef73b775f64\") " pod="openshift-must-gather-4lkz8/must-gather-jc7xh" Dec 02 08:43:47 crc kubenswrapper[4691]: I1202 08:43:47.838180 4691 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-4lkz8/must-gather-jc7xh" Dec 02 08:43:48 crc kubenswrapper[4691]: I1202 08:43:48.339653 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-4lkz8/must-gather-jc7xh"] Dec 02 08:43:48 crc kubenswrapper[4691]: W1202 08:43:48.351782 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f74c8c7_5577_4b38_824e_0ef73b775f64.slice/crio-dcfddb6ce1c8908caa1cae364f6f7708f3be59514c0dbc43933234a61f2d86e5 WatchSource:0}: Error finding container dcfddb6ce1c8908caa1cae364f6f7708f3be59514c0dbc43933234a61f2d86e5: Status 404 returned error can't find the container with id dcfddb6ce1c8908caa1cae364f6f7708f3be59514c0dbc43933234a61f2d86e5 Dec 02 08:43:48 crc kubenswrapper[4691]: I1202 08:43:48.514114 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4lkz8/must-gather-jc7xh" event={"ID":"0f74c8c7-5577-4b38-824e-0ef73b775f64","Type":"ContainerStarted","Data":"dcfddb6ce1c8908caa1cae364f6f7708f3be59514c0dbc43933234a61f2d86e5"} Dec 02 08:43:52 crc kubenswrapper[4691]: I1202 08:43:52.558795 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4lkz8/must-gather-jc7xh" event={"ID":"0f74c8c7-5577-4b38-824e-0ef73b775f64","Type":"ContainerStarted","Data":"286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1"} Dec 02 08:43:53 crc kubenswrapper[4691]: I1202 08:43:53.568242 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4lkz8/must-gather-jc7xh" event={"ID":"0f74c8c7-5577-4b38-824e-0ef73b775f64","Type":"ContainerStarted","Data":"2aae764f483f977dadf5e47ec8cb84d3fdc965c7ea50c0914b288a6fa2275129"} Dec 02 08:43:53 crc kubenswrapper[4691]: I1202 08:43:53.591889 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4lkz8/must-gather-jc7xh" podStartSLOduration=2.828069351 podStartE2EDuration="6.591870378s" podCreationTimestamp="2025-12-02 08:43:47 +0000 UTC" firstStartedPulling="2025-12-02 08:43:48.365650517 +0000 UTC m=+3476.149729379" lastFinishedPulling="2025-12-02 08:43:52.129451544 +0000 UTC m=+3479.913530406" observedRunningTime="2025-12-02 08:43:53.585598209 +0000 UTC m=+3481.369677081" watchObservedRunningTime="2025-12-02 08:43:53.591870378 +0000 UTC m=+3481.375949240" Dec 02 08:43:56 crc kubenswrapper[4691]: I1202 08:43:56.101361 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4lkz8/crc-debug-jp84m"] Dec 02 08:43:56 crc kubenswrapper[4691]: I1202 08:43:56.103532 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4lkz8/crc-debug-jp84m" Dec 02 08:43:56 crc kubenswrapper[4691]: I1202 08:43:56.105991 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4lkz8"/"default-dockercfg-9z8l4" Dec 02 08:43:56 crc kubenswrapper[4691]: I1202 08:43:56.208790 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae401371-fff0-480a-9add-11063773f297-host\") pod \"crc-debug-jp84m\" (UID: \"ae401371-fff0-480a-9add-11063773f297\") " pod="openshift-must-gather-4lkz8/crc-debug-jp84m" Dec 02 08:43:56 crc kubenswrapper[4691]: I1202 08:43:56.209060 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86c2n\" (UniqueName: \"kubernetes.io/projected/ae401371-fff0-480a-9add-11063773f297-kube-api-access-86c2n\") pod \"crc-debug-jp84m\" (UID: \"ae401371-fff0-480a-9add-11063773f297\") " pod="openshift-must-gather-4lkz8/crc-debug-jp84m" Dec 02 08:43:56 crc kubenswrapper[4691]: I1202 08:43:56.310619 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae401371-fff0-480a-9add-11063773f297-host\") pod \"crc-debug-jp84m\" (UID: \"ae401371-fff0-480a-9add-11063773f297\") " pod="openshift-must-gather-4lkz8/crc-debug-jp84m" Dec 02 08:43:56 crc kubenswrapper[4691]: I1202 08:43:56.310745 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86c2n\" (UniqueName: \"kubernetes.io/projected/ae401371-fff0-480a-9add-11063773f297-kube-api-access-86c2n\") pod \"crc-debug-jp84m\" (UID: \"ae401371-fff0-480a-9add-11063773f297\") " pod="openshift-must-gather-4lkz8/crc-debug-jp84m" Dec 02 08:43:56 crc kubenswrapper[4691]: I1202 08:43:56.310747 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae401371-fff0-480a-9add-11063773f297-host\") pod \"crc-debug-jp84m\" (UID: \"ae401371-fff0-480a-9add-11063773f297\") " pod="openshift-must-gather-4lkz8/crc-debug-jp84m" Dec 02 08:43:56 crc kubenswrapper[4691]: I1202 08:43:56.336710 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86c2n\" (UniqueName: \"kubernetes.io/projected/ae401371-fff0-480a-9add-11063773f297-kube-api-access-86c2n\") pod \"crc-debug-jp84m\" (UID: \"ae401371-fff0-480a-9add-11063773f297\") " pod="openshift-must-gather-4lkz8/crc-debug-jp84m" Dec 02 08:43:56 crc kubenswrapper[4691]: I1202 08:43:56.428483 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-4lkz8/crc-debug-jp84m" Dec 02 08:43:56 crc kubenswrapper[4691]: W1202 08:43:56.469773 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae401371_fff0_480a_9add_11063773f297.slice/crio-79af99c9266051b36715801f94215861ef0c9d957661de45c5c8ee77c68cb1a4 WatchSource:0}: Error finding container 79af99c9266051b36715801f94215861ef0c9d957661de45c5c8ee77c68cb1a4: Status 404 returned error can't find the container with id 79af99c9266051b36715801f94215861ef0c9d957661de45c5c8ee77c68cb1a4 Dec 02 08:43:56 crc kubenswrapper[4691]: I1202 08:43:56.619625 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4lkz8/crc-debug-jp84m" event={"ID":"ae401371-fff0-480a-9add-11063773f297","Type":"ContainerStarted","Data":"79af99c9266051b36715801f94215861ef0c9d957661de45c5c8ee77c68cb1a4"} Dec 02 08:44:08 crc kubenswrapper[4691]: I1202 08:44:08.738519 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4lkz8/crc-debug-jp84m" event={"ID":"ae401371-fff0-480a-9add-11063773f297","Type":"ContainerStarted","Data":"f03b1ae32240fa0cba1fc5277f16e5cadf63f30e69c1e3794665d9b0211e3961"} Dec 02 08:44:08 crc kubenswrapper[4691]: I1202 08:44:08.755938 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-4lkz8/crc-debug-jp84m" podStartSLOduration=1.449019907 podStartE2EDuration="12.755922109s" podCreationTimestamp="2025-12-02 08:43:56 +0000 UTC" firstStartedPulling="2025-12-02 08:43:56.475007311 +0000 UTC m=+3484.259086163" lastFinishedPulling="2025-12-02 08:44:07.781909503 +0000 UTC m=+3495.565988365" observedRunningTime="2025-12-02 08:44:08.7528085 +0000 UTC m=+3496.536887362" watchObservedRunningTime="2025-12-02 08:44:08.755922109 +0000 UTC m=+3496.540000971" Dec 02 08:44:23 crc kubenswrapper[4691]: I1202 08:44:23.608902 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fx7fc"] Dec 02 08:44:23 crc kubenswrapper[4691]: I1202 08:44:23.611994 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:23 crc kubenswrapper[4691]: I1202 08:44:23.621710 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fx7fc"] Dec 02 08:44:23 crc kubenswrapper[4691]: I1202 08:44:23.704131 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htkgp\" (UniqueName: \"kubernetes.io/projected/02a16a12-b591-464c-8f23-2ef941088085-kube-api-access-htkgp\") pod \"redhat-operators-fx7fc\" (UID: \"02a16a12-b591-464c-8f23-2ef941088085\") " pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:23 crc kubenswrapper[4691]: I1202 08:44:23.704255 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a16a12-b591-464c-8f23-2ef941088085-utilities\") pod \"redhat-operators-fx7fc\" (UID: \"02a16a12-b591-464c-8f23-2ef941088085\") " pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:23 crc kubenswrapper[4691]: I1202 08:44:23.704417 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a16a12-b591-464c-8f23-2ef941088085-catalog-content\") pod \"redhat-operators-fx7fc\" (UID: \"02a16a12-b591-464c-8f23-2ef941088085\") " pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:23 crc kubenswrapper[4691]: I1202 08:44:23.806425 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htkgp\" (UniqueName: \"kubernetes.io/projected/02a16a12-b591-464c-8f23-2ef941088085-kube-api-access-htkgp\") pod \"redhat-operators-fx7fc\" (UID: \"02a16a12-b591-464c-8f23-2ef941088085\") " pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:23 crc kubenswrapper[4691]: I1202 08:44:23.806501 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a16a12-b591-464c-8f23-2ef941088085-utilities\") pod \"redhat-operators-fx7fc\" (UID: \"02a16a12-b591-464c-8f23-2ef941088085\") " pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:23 crc kubenswrapper[4691]: I1202 08:44:23.806583 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a16a12-b591-464c-8f23-2ef941088085-catalog-content\") pod \"redhat-operators-fx7fc\" (UID: \"02a16a12-b591-464c-8f23-2ef941088085\") " pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:23 crc kubenswrapper[4691]: I1202 08:44:23.807531 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a16a12-b591-464c-8f23-2ef941088085-catalog-content\") pod \"redhat-operators-fx7fc\" (UID: \"02a16a12-b591-464c-8f23-2ef941088085\") " pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:23 crc kubenswrapper[4691]: I1202 08:44:23.807611 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a16a12-b591-464c-8f23-2ef941088085-utilities\") pod \"redhat-operators-fx7fc\" (UID: \"02a16a12-b591-464c-8f23-2ef941088085\") " pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:23 crc kubenswrapper[4691]: I1202 08:44:23.828512 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-htkgp\" (UniqueName: \"kubernetes.io/projected/02a16a12-b591-464c-8f23-2ef941088085-kube-api-access-htkgp\") pod \"redhat-operators-fx7fc\" (UID: \"02a16a12-b591-464c-8f23-2ef941088085\") " pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:24 crc kubenswrapper[4691]: I1202 08:44:24.994306 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:25 crc kubenswrapper[4691]: I1202 08:44:25.671787 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fx7fc"] Dec 02 08:44:25 crc kubenswrapper[4691]: W1202 08:44:25.673494 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02a16a12_b591_464c_8f23_2ef941088085.slice/crio-5aabf8b39220a12c41d22efb74e1e55c6032f3052f7f3d2cc0ea7491984079f2 WatchSource:0}: Error finding container 5aabf8b39220a12c41d22efb74e1e55c6032f3052f7f3d2cc0ea7491984079f2: Status 404 returned error can't find the container with id 5aabf8b39220a12c41d22efb74e1e55c6032f3052f7f3d2cc0ea7491984079f2 Dec 02 08:44:26 crc kubenswrapper[4691]: I1202 08:44:26.112127 4691 generic.go:334] "Generic (PLEG): container finished" podID="02a16a12-b591-464c-8f23-2ef941088085" containerID="ccc2e79f17b8e7fefdd86decfd09cc2ceda5075a4d5d0debc1b9bce358542af3" exitCode=0 Dec 02 08:44:26 crc kubenswrapper[4691]: I1202 08:44:26.112243 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx7fc" event={"ID":"02a16a12-b591-464c-8f23-2ef941088085","Type":"ContainerDied","Data":"ccc2e79f17b8e7fefdd86decfd09cc2ceda5075a4d5d0debc1b9bce358542af3"} Dec 02 08:44:26 crc kubenswrapper[4691]: I1202 08:44:26.112495 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx7fc" event={"ID":"02a16a12-b591-464c-8f23-2ef941088085","Type":"ContainerStarted","Data":"5aabf8b39220a12c41d22efb74e1e55c6032f3052f7f3d2cc0ea7491984079f2"} Dec 02 08:44:28 crc kubenswrapper[4691]: I1202 08:44:28.137088 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx7fc" event={"ID":"02a16a12-b591-464c-8f23-2ef941088085","Type":"ContainerStarted","Data":"b8a73e30b2b6178e6597200ab04643374f9a1255e9b6ecfef2b9b68e31f47853"} Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.173003 4691 generic.go:334] "Generic (PLEG): container finished" podID="02a16a12-b591-464c-8f23-2ef941088085" containerID="b8a73e30b2b6178e6597200ab04643374f9a1255e9b6ecfef2b9b68e31f47853" exitCode=0 Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.173242 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx7fc" event={"ID":"02a16a12-b591-464c-8f23-2ef941088085","Type":"ContainerDied","Data":"b8a73e30b2b6178e6597200ab04643374f9a1255e9b6ecfef2b9b68e31f47853"} Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.255352 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mcz2j"] Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.258161 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.269087 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mcz2j"] Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.369714 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5345727a-ec29-442f-a119-bad03d64d060-utilities\") pod \"community-operators-mcz2j\" (UID: \"5345727a-ec29-442f-a119-bad03d64d060\") " pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.369825 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqwc7\" (UniqueName: \"kubernetes.io/projected/5345727a-ec29-442f-a119-bad03d64d060-kube-api-access-nqwc7\") pod \"community-operators-mcz2j\" (UID: \"5345727a-ec29-442f-a119-bad03d64d060\") " pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.369929 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5345727a-ec29-442f-a119-bad03d64d060-catalog-content\") pod \"community-operators-mcz2j\" (UID: \"5345727a-ec29-442f-a119-bad03d64d060\") " pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.485926 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5345727a-ec29-442f-a119-bad03d64d060-catalog-content\") pod \"community-operators-mcz2j\" (UID: \"5345727a-ec29-442f-a119-bad03d64d060\") " pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.486186 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5345727a-ec29-442f-a119-bad03d64d060-utilities\") pod \"community-operators-mcz2j\" (UID: \"5345727a-ec29-442f-a119-bad03d64d060\") " pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.486600 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqwc7\" (UniqueName: \"kubernetes.io/projected/5345727a-ec29-442f-a119-bad03d64d060-kube-api-access-nqwc7\") pod \"community-operators-mcz2j\" (UID: \"5345727a-ec29-442f-a119-bad03d64d060\") " pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.487280 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5345727a-ec29-442f-a119-bad03d64d060-catalog-content\") pod \"community-operators-mcz2j\" (UID: \"5345727a-ec29-442f-a119-bad03d64d060\") " pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.495040 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5345727a-ec29-442f-a119-bad03d64d060-utilities\") pod \"community-operators-mcz2j\" (UID: \"5345727a-ec29-442f-a119-bad03d64d060\") " pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.526703 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-nqwc7\" (UniqueName: \"kubernetes.io/projected/5345727a-ec29-442f-a119-bad03d64d060-kube-api-access-nqwc7\") pod \"community-operators-mcz2j\" (UID: \"5345727a-ec29-442f-a119-bad03d64d060\") " pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:30 crc kubenswrapper[4691]: I1202 08:44:30.606071 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:31 crc kubenswrapper[4691]: I1202 08:44:31.189889 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx7fc" event={"ID":"02a16a12-b591-464c-8f23-2ef941088085","Type":"ContainerStarted","Data":"dbb75be0ed20ae2c0e3a7e736ce78355db436b64ebd9f8623da41eeed7bea306"} Dec 02 08:44:31 crc kubenswrapper[4691]: I1202 08:44:31.216103 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fx7fc" podStartSLOduration=3.631907205 podStartE2EDuration="8.216084519s" podCreationTimestamp="2025-12-02 08:44:23 +0000 UTC" firstStartedPulling="2025-12-02 08:44:26.114658297 +0000 UTC m=+3513.898737159" lastFinishedPulling="2025-12-02 08:44:30.698835611 +0000 UTC m=+3518.482914473" observedRunningTime="2025-12-02 08:44:31.213248957 +0000 UTC m=+3518.997327819" watchObservedRunningTime="2025-12-02 08:44:31.216084519 +0000 UTC m=+3519.000163381" Dec 02 08:44:31 crc kubenswrapper[4691]: I1202 08:44:31.243530 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mcz2j"] Dec 02 08:44:31 crc kubenswrapper[4691]: W1202 08:44:31.254553 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5345727a_ec29_442f_a119_bad03d64d060.slice/crio-7af6a371e343af208b666936dce75fb1e9a0492b7dd55811db7bee454a558c37 WatchSource:0}: Error finding container 7af6a371e343af208b666936dce75fb1e9a0492b7dd55811db7bee454a558c37: Status 404 returned error can't find the container with id 7af6a371e343af208b666936dce75fb1e9a0492b7dd55811db7bee454a558c37 Dec 02 08:44:32 crc kubenswrapper[4691]: I1202 08:44:32.207025 4691 generic.go:334] "Generic (PLEG): container finished" podID="5345727a-ec29-442f-a119-bad03d64d060" containerID="3cc4e611baddf1e0a760fea310b4c028f574f3f65c0799d0bb2b24d3055427fc" exitCode=0 Dec 02 08:44:32 crc kubenswrapper[4691]: I1202 08:44:32.207157 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcz2j" event={"ID":"5345727a-ec29-442f-a119-bad03d64d060","Type":"ContainerDied","Data":"3cc4e611baddf1e0a760fea310b4c028f574f3f65c0799d0bb2b24d3055427fc"} Dec 02 08:44:32 crc kubenswrapper[4691]: I1202 08:44:32.207481 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcz2j" event={"ID":"5345727a-ec29-442f-a119-bad03d64d060","Type":"ContainerStarted","Data":"7af6a371e343af208b666936dce75fb1e9a0492b7dd55811db7bee454a558c37"} Dec 02 08:44:34 crc kubenswrapper[4691]: I1202 08:44:34.265992 4691 generic.go:334] "Generic (PLEG): container finished" podID="5345727a-ec29-442f-a119-bad03d64d060" containerID="cf8b0677a518c7c2f8dd5d78e287bada757b8a29c9403af6214ab85b1e6b437d" exitCode=0 Dec 02 08:44:34 crc kubenswrapper[4691]: I1202 08:44:34.266955 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcz2j" 
event={"ID":"5345727a-ec29-442f-a119-bad03d64d060","Type":"ContainerDied","Data":"cf8b0677a518c7c2f8dd5d78e287bada757b8a29c9403af6214ab85b1e6b437d"} Dec 02 08:44:34 crc kubenswrapper[4691]: I1202 08:44:34.994558 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:34 crc kubenswrapper[4691]: I1202 08:44:34.995135 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:35 crc kubenswrapper[4691]: I1202 08:44:35.281552 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcz2j" event={"ID":"5345727a-ec29-442f-a119-bad03d64d060","Type":"ContainerStarted","Data":"49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a"} Dec 02 08:44:35 crc kubenswrapper[4691]: I1202 08:44:35.312940 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mcz2j" podStartSLOduration=2.669111706 podStartE2EDuration="5.312908712s" podCreationTimestamp="2025-12-02 08:44:30 +0000 UTC" firstStartedPulling="2025-12-02 08:44:32.208893931 +0000 UTC m=+3519.992972793" lastFinishedPulling="2025-12-02 08:44:34.852690937 +0000 UTC m=+3522.636769799" observedRunningTime="2025-12-02 08:44:35.308209574 +0000 UTC m=+3523.092288456" watchObservedRunningTime="2025-12-02 08:44:35.312908712 +0000 UTC m=+3523.096987574" Dec 02 08:44:36 crc kubenswrapper[4691]: I1202 08:44:36.054584 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fx7fc" podUID="02a16a12-b591-464c-8f23-2ef941088085" containerName="registry-server" probeResult="failure" output=< Dec 02 08:44:36 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Dec 02 08:44:36 crc kubenswrapper[4691]: > Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.615535 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hp5hq"] Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.618862 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.631091 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hp5hq"] Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.755467 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d97227e-a77e-4107-a3a1-24a46c600da3-catalog-content\") pod \"certified-operators-hp5hq\" (UID: \"2d97227e-a77e-4107-a3a1-24a46c600da3\") " pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.755747 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc4lf\" (UniqueName: \"kubernetes.io/projected/2d97227e-a77e-4107-a3a1-24a46c600da3-kube-api-access-vc4lf\") pod \"certified-operators-hp5hq\" (UID: \"2d97227e-a77e-4107-a3a1-24a46c600da3\") " pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.755874 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d97227e-a77e-4107-a3a1-24a46c600da3-utilities\") pod \"certified-operators-hp5hq\" (UID: \"2d97227e-a77e-4107-a3a1-24a46c600da3\") " pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.857407 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc4lf\" (UniqueName: \"kubernetes.io/projected/2d97227e-a77e-4107-a3a1-24a46c600da3-kube-api-access-vc4lf\") pod \"certified-operators-hp5hq\" (UID: \"2d97227e-a77e-4107-a3a1-24a46c600da3\") " pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.857858 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d97227e-a77e-4107-a3a1-24a46c600da3-utilities\") pod \"certified-operators-hp5hq\" (UID: \"2d97227e-a77e-4107-a3a1-24a46c600da3\") " pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.857909 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d97227e-a77e-4107-a3a1-24a46c600da3-catalog-content\") pod \"certified-operators-hp5hq\" (UID: \"2d97227e-a77e-4107-a3a1-24a46c600da3\") " pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.858428 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d97227e-a77e-4107-a3a1-24a46c600da3-catalog-content\") pod \"certified-operators-hp5hq\" (UID: \"2d97227e-a77e-4107-a3a1-24a46c600da3\") " pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.858515 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d97227e-a77e-4107-a3a1-24a46c600da3-utilities\") pod \"certified-operators-hp5hq\" (UID: \"2d97227e-a77e-4107-a3a1-24a46c600da3\") " pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.903853 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-vc4lf\" (UniqueName: \"kubernetes.io/projected/2d97227e-a77e-4107-a3a1-24a46c600da3-kube-api-access-vc4lf\") pod \"certified-operators-hp5hq\" (UID: \"2d97227e-a77e-4107-a3a1-24a46c600da3\") " pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:37 crc kubenswrapper[4691]: I1202 08:44:37.950231 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:38 crc kubenswrapper[4691]: I1202 08:44:38.607873 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hp5hq"] Dec 02 08:44:39 crc kubenswrapper[4691]: I1202 08:44:39.333528 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hp5hq" event={"ID":"2d97227e-a77e-4107-a3a1-24a46c600da3","Type":"ContainerStarted","Data":"a1764e0e7a3b4b2219f4dfb991a3be3adf0cad37b20f7fc64d1be3b7388bb487"} Dec 02 08:44:40 crc kubenswrapper[4691]: I1202 08:44:40.343907 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hp5hq" event={"ID":"2d97227e-a77e-4107-a3a1-24a46c600da3","Type":"ContainerStarted","Data":"429ca05da6baf79798f6a12666018fc9cc74a63250df64d0223a1b1475620553"} Dec 02 08:44:40 crc kubenswrapper[4691]: I1202 08:44:40.607161 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:40 crc kubenswrapper[4691]: I1202 08:44:40.607524 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:40 crc kubenswrapper[4691]: I1202 08:44:40.663189 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:41 crc kubenswrapper[4691]: I1202 08:44:41.354341 4691 generic.go:334] "Generic (PLEG): container finished" podID="2d97227e-a77e-4107-a3a1-24a46c600da3" containerID="429ca05da6baf79798f6a12666018fc9cc74a63250df64d0223a1b1475620553" exitCode=0 Dec 02 08:44:41 crc kubenswrapper[4691]: I1202 08:44:41.355640 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hp5hq" event={"ID":"2d97227e-a77e-4107-a3a1-24a46c600da3","Type":"ContainerDied","Data":"429ca05da6baf79798f6a12666018fc9cc74a63250df64d0223a1b1475620553"} Dec 02 08:44:41 crc kubenswrapper[4691]: I1202 08:44:41.408457 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:42 crc kubenswrapper[4691]: I1202 08:44:42.792738 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mcz2j"] Dec 02 08:44:43 crc kubenswrapper[4691]: I1202 08:44:43.377631 4691 generic.go:334] "Generic (PLEG): container finished" podID="2d97227e-a77e-4107-a3a1-24a46c600da3" containerID="fda1adb5b761c5fa1cb98e46898aa8ce492e978a22e97145e4d43783f550c3ac" exitCode=0 Dec 02 08:44:43 crc kubenswrapper[4691]: I1202 08:44:43.378122 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hp5hq" event={"ID":"2d97227e-a77e-4107-a3a1-24a46c600da3","Type":"ContainerDied","Data":"fda1adb5b761c5fa1cb98e46898aa8ce492e978a22e97145e4d43783f550c3ac"} Dec 02 08:44:44 crc kubenswrapper[4691]: I1202 08:44:44.396617 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-hp5hq" event={"ID":"2d97227e-a77e-4107-a3a1-24a46c600da3","Type":"ContainerStarted","Data":"7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b"} Dec 02 08:44:44 crc kubenswrapper[4691]: I1202 08:44:44.397221 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mcz2j" podUID="5345727a-ec29-442f-a119-bad03d64d060" containerName="registry-server" containerID="cri-o://49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a" gracePeriod=2 Dec 02 08:44:44 crc kubenswrapper[4691]: I1202 08:44:44.424984 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hp5hq" podStartSLOduration=4.988072925 podStartE2EDuration="7.424965928s" podCreationTimestamp="2025-12-02 08:44:37 +0000 UTC" firstStartedPulling="2025-12-02 08:44:41.360038834 +0000 UTC m=+3529.144117696" lastFinishedPulling="2025-12-02 08:44:43.796931807 +0000 UTC m=+3531.581010699" observedRunningTime="2025-12-02 08:44:44.420422643 +0000 UTC m=+3532.204501515" watchObservedRunningTime="2025-12-02 08:44:44.424965928 +0000 UTC m=+3532.209044790" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.012834 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.061084 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.145513 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqwc7\" (UniqueName: \"kubernetes.io/projected/5345727a-ec29-442f-a119-bad03d64d060-kube-api-access-nqwc7\") pod \"5345727a-ec29-442f-a119-bad03d64d060\" (UID: \"5345727a-ec29-442f-a119-bad03d64d060\") " Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.145601 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5345727a-ec29-442f-a119-bad03d64d060-catalog-content\") pod \"5345727a-ec29-442f-a119-bad03d64d060\" (UID: \"5345727a-ec29-442f-a119-bad03d64d060\") " Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.145683 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5345727a-ec29-442f-a119-bad03d64d060-utilities\") pod \"5345727a-ec29-442f-a119-bad03d64d060\" (UID: \"5345727a-ec29-442f-a119-bad03d64d060\") " Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.147929 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5345727a-ec29-442f-a119-bad03d64d060-utilities" (OuterVolumeSpecName: "utilities") pod "5345727a-ec29-442f-a119-bad03d64d060" (UID: "5345727a-ec29-442f-a119-bad03d64d060"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.151371 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.155034 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5345727a-ec29-442f-a119-bad03d64d060-kube-api-access-nqwc7" (OuterVolumeSpecName: "kube-api-access-nqwc7") pod "5345727a-ec29-442f-a119-bad03d64d060" (UID: "5345727a-ec29-442f-a119-bad03d64d060"). InnerVolumeSpecName "kube-api-access-nqwc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.222061 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5345727a-ec29-442f-a119-bad03d64d060-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5345727a-ec29-442f-a119-bad03d64d060" (UID: "5345727a-ec29-442f-a119-bad03d64d060"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.247685 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqwc7\" (UniqueName: \"kubernetes.io/projected/5345727a-ec29-442f-a119-bad03d64d060-kube-api-access-nqwc7\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.247713 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5345727a-ec29-442f-a119-bad03d64d060-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.247724 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5345727a-ec29-442f-a119-bad03d64d060-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.407710 4691 generic.go:334] "Generic (PLEG): container finished" podID="5345727a-ec29-442f-a119-bad03d64d060" containerID="49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a" exitCode=0 Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.408838 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mcz2j" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.417905 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcz2j" event={"ID":"5345727a-ec29-442f-a119-bad03d64d060","Type":"ContainerDied","Data":"49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a"} Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.417965 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mcz2j" event={"ID":"5345727a-ec29-442f-a119-bad03d64d060","Type":"ContainerDied","Data":"7af6a371e343af208b666936dce75fb1e9a0492b7dd55811db7bee454a558c37"} Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.417991 4691 scope.go:117] "RemoveContainer" containerID="49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.442589 4691 scope.go:117] "RemoveContainer" containerID="cf8b0677a518c7c2f8dd5d78e287bada757b8a29c9403af6214ab85b1e6b437d" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.444484 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mcz2j"] Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.468838 4691 scope.go:117] "RemoveContainer" containerID="3cc4e611baddf1e0a760fea310b4c028f574f3f65c0799d0bb2b24d3055427fc" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.469199 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mcz2j"] Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.518931 4691 scope.go:117] "RemoveContainer" containerID="49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a" Dec 02 08:44:45 crc kubenswrapper[4691]: E1202 08:44:45.519501 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a\": container with ID starting with 49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a not found: ID does not exist" containerID="49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.519616 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a"} err="failed to get container status \"49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a\": rpc error: code = NotFound desc = could not find container \"49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a\": container with ID starting with 49a94317a74bc429ac8ed211e96424162d0f9353e0b628f40950ad95764bee1a not found: ID does not exist" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.519665 4691 scope.go:117] "RemoveContainer" containerID="cf8b0677a518c7c2f8dd5d78e287bada757b8a29c9403af6214ab85b1e6b437d" Dec 02 08:44:45 crc kubenswrapper[4691]: E1202 08:44:45.520088 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf8b0677a518c7c2f8dd5d78e287bada757b8a29c9403af6214ab85b1e6b437d\": container with ID starting with cf8b0677a518c7c2f8dd5d78e287bada757b8a29c9403af6214ab85b1e6b437d not found: ID does not exist" containerID="cf8b0677a518c7c2f8dd5d78e287bada757b8a29c9403af6214ab85b1e6b437d" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.520157 4691 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf8b0677a518c7c2f8dd5d78e287bada757b8a29c9403af6214ab85b1e6b437d"} err="failed to get container status \"cf8b0677a518c7c2f8dd5d78e287bada757b8a29c9403af6214ab85b1e6b437d\": rpc error: code = NotFound desc = could not find container \"cf8b0677a518c7c2f8dd5d78e287bada757b8a29c9403af6214ab85b1e6b437d\": container with ID starting with cf8b0677a518c7c2f8dd5d78e287bada757b8a29c9403af6214ab85b1e6b437d not found: ID does not exist" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.520185 4691 scope.go:117] "RemoveContainer" containerID="3cc4e611baddf1e0a760fea310b4c028f574f3f65c0799d0bb2b24d3055427fc" Dec 02 08:44:45 crc kubenswrapper[4691]: E1202 08:44:45.520451 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cc4e611baddf1e0a760fea310b4c028f574f3f65c0799d0bb2b24d3055427fc\": container with ID starting with 3cc4e611baddf1e0a760fea310b4c028f574f3f65c0799d0bb2b24d3055427fc not found: ID does not exist" containerID="3cc4e611baddf1e0a760fea310b4c028f574f3f65c0799d0bb2b24d3055427fc" Dec 02 08:44:45 crc kubenswrapper[4691]: I1202 08:44:45.520505 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cc4e611baddf1e0a760fea310b4c028f574f3f65c0799d0bb2b24d3055427fc"} err="failed to get container status \"3cc4e611baddf1e0a760fea310b4c028f574f3f65c0799d0bb2b24d3055427fc\": rpc error: code = NotFound desc = could not find container \"3cc4e611baddf1e0a760fea310b4c028f574f3f65c0799d0bb2b24d3055427fc\": container with ID starting with 3cc4e611baddf1e0a760fea310b4c028f574f3f65c0799d0bb2b24d3055427fc not found: ID does not exist" Dec 02 08:44:46 crc kubenswrapper[4691]: I1202 08:44:46.575129 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5345727a-ec29-442f-a119-bad03d64d060" path="/var/lib/kubelet/pods/5345727a-ec29-442f-a119-bad03d64d060/volumes" Dec 02 08:44:47 crc kubenswrapper[4691]: I1202 08:44:47.950850 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:47 crc kubenswrapper[4691]: I1202 08:44:47.952007 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:48 crc kubenswrapper[4691]: I1202 08:44:48.005659 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:49 crc kubenswrapper[4691]: I1202 08:44:49.192838 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fx7fc"] Dec 02 08:44:49 crc kubenswrapper[4691]: I1202 08:44:49.193509 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fx7fc" podUID="02a16a12-b591-464c-8f23-2ef941088085" containerName="registry-server" containerID="cri-o://dbb75be0ed20ae2c0e3a7e736ce78355db436b64ebd9f8623da41eeed7bea306" gracePeriod=2 Dec 02 08:44:49 crc kubenswrapper[4691]: I1202 08:44:49.445072 4691 generic.go:334] "Generic (PLEG): container finished" podID="02a16a12-b591-464c-8f23-2ef941088085" containerID="dbb75be0ed20ae2c0e3a7e736ce78355db436b64ebd9f8623da41eeed7bea306" exitCode=0 Dec 02 08:44:49 crc kubenswrapper[4691]: I1202 08:44:49.445145 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx7fc" 
event={"ID":"02a16a12-b591-464c-8f23-2ef941088085","Type":"ContainerDied","Data":"dbb75be0ed20ae2c0e3a7e736ce78355db436b64ebd9f8623da41eeed7bea306"} Dec 02 08:44:49 crc kubenswrapper[4691]: I1202 08:44:49.500203 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.208390 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.352733 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a16a12-b591-464c-8f23-2ef941088085-utilities\") pod \"02a16a12-b591-464c-8f23-2ef941088085\" (UID: \"02a16a12-b591-464c-8f23-2ef941088085\") " Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.353303 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htkgp\" (UniqueName: \"kubernetes.io/projected/02a16a12-b591-464c-8f23-2ef941088085-kube-api-access-htkgp\") pod \"02a16a12-b591-464c-8f23-2ef941088085\" (UID: \"02a16a12-b591-464c-8f23-2ef941088085\") " Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.353403 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a16a12-b591-464c-8f23-2ef941088085-catalog-content\") pod \"02a16a12-b591-464c-8f23-2ef941088085\" (UID: \"02a16a12-b591-464c-8f23-2ef941088085\") " Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.353554 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02a16a12-b591-464c-8f23-2ef941088085-utilities" (OuterVolumeSpecName: "utilities") pod "02a16a12-b591-464c-8f23-2ef941088085" (UID: "02a16a12-b591-464c-8f23-2ef941088085"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.354038 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02a16a12-b591-464c-8f23-2ef941088085-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.374581 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02a16a12-b591-464c-8f23-2ef941088085-kube-api-access-htkgp" (OuterVolumeSpecName: "kube-api-access-htkgp") pod "02a16a12-b591-464c-8f23-2ef941088085" (UID: "02a16a12-b591-464c-8f23-2ef941088085"). InnerVolumeSpecName "kube-api-access-htkgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.456087 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htkgp\" (UniqueName: \"kubernetes.io/projected/02a16a12-b591-464c-8f23-2ef941088085-kube-api-access-htkgp\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.465609 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fx7fc" Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.466347 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fx7fc" event={"ID":"02a16a12-b591-464c-8f23-2ef941088085","Type":"ContainerDied","Data":"5aabf8b39220a12c41d22efb74e1e55c6032f3052f7f3d2cc0ea7491984079f2"} Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.466406 4691 scope.go:117] "RemoveContainer" containerID="dbb75be0ed20ae2c0e3a7e736ce78355db436b64ebd9f8623da41eeed7bea306" Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.506838 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02a16a12-b591-464c-8f23-2ef941088085-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02a16a12-b591-464c-8f23-2ef941088085" (UID: "02a16a12-b591-464c-8f23-2ef941088085"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.520119 4691 scope.go:117] "RemoveContainer" containerID="b8a73e30b2b6178e6597200ab04643374f9a1255e9b6ecfef2b9b68e31f47853" Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.546669 4691 scope.go:117] "RemoveContainer" containerID="ccc2e79f17b8e7fefdd86decfd09cc2ceda5075a4d5d0debc1b9bce358542af3" Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.559015 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02a16a12-b591-464c-8f23-2ef941088085-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.790512 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fx7fc"] Dec 02 08:44:50 crc kubenswrapper[4691]: I1202 08:44:50.801341 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fx7fc"] Dec 02 08:44:51 crc kubenswrapper[4691]: I1202 08:44:51.390531 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hp5hq"] Dec 02 08:44:51 crc kubenswrapper[4691]: I1202 08:44:51.475172 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hp5hq" podUID="2d97227e-a77e-4107-a3a1-24a46c600da3" containerName="registry-server" containerID="cri-o://7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b" gracePeriod=2 Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.123750 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hp5hq" Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.318079 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d97227e-a77e-4107-a3a1-24a46c600da3-catalog-content\") pod \"2d97227e-a77e-4107-a3a1-24a46c600da3\" (UID: \"2d97227e-a77e-4107-a3a1-24a46c600da3\") " Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.318211 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc4lf\" (UniqueName: \"kubernetes.io/projected/2d97227e-a77e-4107-a3a1-24a46c600da3-kube-api-access-vc4lf\") pod \"2d97227e-a77e-4107-a3a1-24a46c600da3\" (UID: \"2d97227e-a77e-4107-a3a1-24a46c600da3\") " Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.318359 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d97227e-a77e-4107-a3a1-24a46c600da3-utilities\") pod \"2d97227e-a77e-4107-a3a1-24a46c600da3\" (UID: \"2d97227e-a77e-4107-a3a1-24a46c600da3\") " Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.319529 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d97227e-a77e-4107-a3a1-24a46c600da3-utilities" (OuterVolumeSpecName: "utilities") pod "2d97227e-a77e-4107-a3a1-24a46c600da3" (UID: "2d97227e-a77e-4107-a3a1-24a46c600da3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.324955 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d97227e-a77e-4107-a3a1-24a46c600da3-kube-api-access-vc4lf" (OuterVolumeSpecName: "kube-api-access-vc4lf") pod "2d97227e-a77e-4107-a3a1-24a46c600da3" (UID: "2d97227e-a77e-4107-a3a1-24a46c600da3"). InnerVolumeSpecName "kube-api-access-vc4lf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.388631 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d97227e-a77e-4107-a3a1-24a46c600da3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2d97227e-a77e-4107-a3a1-24a46c600da3" (UID: "2d97227e-a77e-4107-a3a1-24a46c600da3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.421651 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d97227e-a77e-4107-a3a1-24a46c600da3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.421689 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc4lf\" (UniqueName: \"kubernetes.io/projected/2d97227e-a77e-4107-a3a1-24a46c600da3-kube-api-access-vc4lf\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.421712 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d97227e-a77e-4107-a3a1-24a46c600da3-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.491076 4691 generic.go:334] "Generic (PLEG): container finished" podID="2d97227e-a77e-4107-a3a1-24a46c600da3" containerID="7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b" exitCode=0 Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.491133 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hp5hq" event={"ID":"2d97227e-a77e-4107-a3a1-24a46c600da3","Type":"ContainerDied","Data":"7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b"} Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.491183 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hp5hq" event={"ID":"2d97227e-a77e-4107-a3a1-24a46c600da3","Type":"ContainerDied","Data":"a1764e0e7a3b4b2219f4dfb991a3be3adf0cad37b20f7fc64d1be3b7388bb487"} Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.491202 4691 scope.go:117] "RemoveContainer" containerID="7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b" Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.491242 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hp5hq"
Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.516507 4691 scope.go:117] "RemoveContainer" containerID="fda1adb5b761c5fa1cb98e46898aa8ce492e978a22e97145e4d43783f550c3ac"
Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.550788 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hp5hq"]
Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.555402 4691 scope.go:117] "RemoveContainer" containerID="429ca05da6baf79798f6a12666018fc9cc74a63250df64d0223a1b1475620553"
Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.582972 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02a16a12-b591-464c-8f23-2ef941088085" path="/var/lib/kubelet/pods/02a16a12-b591-464c-8f23-2ef941088085/volumes"
Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.585426 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hp5hq"]
Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.607646 4691 scope.go:117] "RemoveContainer" containerID="7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b"
Dec 02 08:44:52 crc kubenswrapper[4691]: E1202 08:44:52.608691 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b\": container with ID starting with 7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b not found: ID does not exist" containerID="7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b"
Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.608772 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b"} err="failed to get container status \"7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b\": rpc error: code = NotFound desc = could not find container \"7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b\": container with ID starting with 7d2d2fe7e23330ceae21c4daa23806a0936f47313fc74d692fa6e7ee7ceecb6b not found: ID does not exist"
Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.608808 4691 scope.go:117] "RemoveContainer" containerID="fda1adb5b761c5fa1cb98e46898aa8ce492e978a22e97145e4d43783f550c3ac"
Dec 02 08:44:52 crc kubenswrapper[4691]: E1202 08:44:52.609470 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fda1adb5b761c5fa1cb98e46898aa8ce492e978a22e97145e4d43783f550c3ac\": container with ID starting with fda1adb5b761c5fa1cb98e46898aa8ce492e978a22e97145e4d43783f550c3ac not found: ID does not exist" containerID="fda1adb5b761c5fa1cb98e46898aa8ce492e978a22e97145e4d43783f550c3ac"
Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.609516 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fda1adb5b761c5fa1cb98e46898aa8ce492e978a22e97145e4d43783f550c3ac"} err="failed to get container status \"fda1adb5b761c5fa1cb98e46898aa8ce492e978a22e97145e4d43783f550c3ac\": rpc error: code = NotFound desc = could not find container \"fda1adb5b761c5fa1cb98e46898aa8ce492e978a22e97145e4d43783f550c3ac\": container with ID starting with fda1adb5b761c5fa1cb98e46898aa8ce492e978a22e97145e4d43783f550c3ac not found: ID does not exist"
Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.609543 4691 scope.go:117] "RemoveContainer" containerID="429ca05da6baf79798f6a12666018fc9cc74a63250df64d0223a1b1475620553"
Dec 02 08:44:52 crc kubenswrapper[4691]: E1202 08:44:52.609910 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"429ca05da6baf79798f6a12666018fc9cc74a63250df64d0223a1b1475620553\": container with ID starting with 429ca05da6baf79798f6a12666018fc9cc74a63250df64d0223a1b1475620553 not found: ID does not exist" containerID="429ca05da6baf79798f6a12666018fc9cc74a63250df64d0223a1b1475620553"
Dec 02 08:44:52 crc kubenswrapper[4691]: I1202 08:44:52.609943 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"429ca05da6baf79798f6a12666018fc9cc74a63250df64d0223a1b1475620553"} err="failed to get container status \"429ca05da6baf79798f6a12666018fc9cc74a63250df64d0223a1b1475620553\": rpc error: code = NotFound desc = could not find container \"429ca05da6baf79798f6a12666018fc9cc74a63250df64d0223a1b1475620553\": container with ID starting with 429ca05da6baf79798f6a12666018fc9cc74a63250df64d0223a1b1475620553 not found: ID does not exist"
Dec 02 08:44:54 crc kubenswrapper[4691]: I1202 08:44:54.577916 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d97227e-a77e-4107-a3a1-24a46c600da3" path="/var/lib/kubelet/pods/2d97227e-a77e-4107-a3a1-24a46c600da3/volumes"
Dec 02 08:44:56 crc kubenswrapper[4691]: I1202 08:44:56.532843 4691 generic.go:334] "Generic (PLEG): container finished" podID="ae401371-fff0-480a-9add-11063773f297" containerID="f03b1ae32240fa0cba1fc5277f16e5cadf63f30e69c1e3794665d9b0211e3961" exitCode=0
Dec 02 08:44:56 crc kubenswrapper[4691]: I1202 08:44:56.532927 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4lkz8/crc-debug-jp84m" event={"ID":"ae401371-fff0-480a-9add-11063773f297","Type":"ContainerDied","Data":"f03b1ae32240fa0cba1fc5277f16e5cadf63f30e69c1e3794665d9b0211e3961"}
Dec 02 08:44:57 crc kubenswrapper[4691]: I1202 08:44:57.735273 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/crc-debug-jp84m"
Dec 02 08:44:57 crc kubenswrapper[4691]: I1202 08:44:57.764643 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4lkz8/crc-debug-jp84m"]
Dec 02 08:44:57 crc kubenswrapper[4691]: I1202 08:44:57.772828 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4lkz8/crc-debug-jp84m"]
Dec 02 08:44:57 crc kubenswrapper[4691]: I1202 08:44:57.929848 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae401371-fff0-480a-9add-11063773f297-host\") pod \"ae401371-fff0-480a-9add-11063773f297\" (UID: \"ae401371-fff0-480a-9add-11063773f297\") "
Dec 02 08:44:57 crc kubenswrapper[4691]: I1202 08:44:57.929946 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ae401371-fff0-480a-9add-11063773f297-host" (OuterVolumeSpecName: "host") pod "ae401371-fff0-480a-9add-11063773f297" (UID: "ae401371-fff0-480a-9add-11063773f297"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 02 08:44:57 crc kubenswrapper[4691]: I1202 08:44:57.929978 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86c2n\" (UniqueName: \"kubernetes.io/projected/ae401371-fff0-480a-9add-11063773f297-kube-api-access-86c2n\") pod \"ae401371-fff0-480a-9add-11063773f297\" (UID: \"ae401371-fff0-480a-9add-11063773f297\") "
Dec 02 08:44:57 crc kubenswrapper[4691]: I1202 08:44:57.930375 4691 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae401371-fff0-480a-9add-11063773f297-host\") on node \"crc\" DevicePath \"\""
Dec 02 08:44:57 crc kubenswrapper[4691]: I1202 08:44:57.937489 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae401371-fff0-480a-9add-11063773f297-kube-api-access-86c2n" (OuterVolumeSpecName: "kube-api-access-86c2n") pod "ae401371-fff0-480a-9add-11063773f297" (UID: "ae401371-fff0-480a-9add-11063773f297"). InnerVolumeSpecName "kube-api-access-86c2n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.027026 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z8br6"]
Dec 02 08:44:58 crc kubenswrapper[4691]: E1202 08:44:58.027537 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5345727a-ec29-442f-a119-bad03d64d060" containerName="extract-content"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.027558 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5345727a-ec29-442f-a119-bad03d64d060" containerName="extract-content"
Dec 02 08:44:58 crc kubenswrapper[4691]: E1202 08:44:58.027578 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a16a12-b591-464c-8f23-2ef941088085" containerName="registry-server"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.027587 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a16a12-b591-464c-8f23-2ef941088085" containerName="registry-server"
Dec 02 08:44:58 crc kubenswrapper[4691]: E1202 08:44:58.027600 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d97227e-a77e-4107-a3a1-24a46c600da3" containerName="extract-utilities"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.027610 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d97227e-a77e-4107-a3a1-24a46c600da3" containerName="extract-utilities"
Dec 02 08:44:58 crc kubenswrapper[4691]: E1202 08:44:58.027632 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5345727a-ec29-442f-a119-bad03d64d060" containerName="registry-server"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.027640 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5345727a-ec29-442f-a119-bad03d64d060" containerName="registry-server"
Dec 02 08:44:58 crc kubenswrapper[4691]: E1202 08:44:58.027649 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae401371-fff0-480a-9add-11063773f297" containerName="container-00"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.027657 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae401371-fff0-480a-9add-11063773f297" containerName="container-00"
Dec 02 08:44:58 crc kubenswrapper[4691]: E1202 08:44:58.027678 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a16a12-b591-464c-8f23-2ef941088085" containerName="extract-content"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.027686 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a16a12-b591-464c-8f23-2ef941088085" containerName="extract-content"
Dec 02 08:44:58 crc kubenswrapper[4691]: E1202 08:44:58.027697 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5345727a-ec29-442f-a119-bad03d64d060" containerName="extract-utilities"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.027705 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5345727a-ec29-442f-a119-bad03d64d060" containerName="extract-utilities"
Dec 02 08:44:58 crc kubenswrapper[4691]: E1202 08:44:58.027715 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02a16a12-b591-464c-8f23-2ef941088085" containerName="extract-utilities"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.027722 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="02a16a12-b591-464c-8f23-2ef941088085" containerName="extract-utilities"
Dec 02 08:44:58 crc kubenswrapper[4691]: E1202 08:44:58.027738 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d97227e-a77e-4107-a3a1-24a46c600da3" containerName="registry-server"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.027746 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d97227e-a77e-4107-a3a1-24a46c600da3" containerName="registry-server"
Dec 02 08:44:58 crc kubenswrapper[4691]: E1202 08:44:58.027778 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d97227e-a77e-4107-a3a1-24a46c600da3" containerName="extract-content"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.055851 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d97227e-a77e-4107-a3a1-24a46c600da3" containerName="extract-content"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.055948 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86c2n\" (UniqueName: \"kubernetes.io/projected/ae401371-fff0-480a-9add-11063773f297-kube-api-access-86c2n\") on node \"crc\" DevicePath \"\""
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.061656 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d97227e-a77e-4107-a3a1-24a46c600da3" containerName="registry-server"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.061709 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae401371-fff0-480a-9add-11063773f297" containerName="container-00"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.061789 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="5345727a-ec29-442f-a119-bad03d64d060" containerName="registry-server"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.061864 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="02a16a12-b591-464c-8f23-2ef941088085" containerName="registry-server"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.085991 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.107356 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8br6"]
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.259608 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a47899a-263b-4544-843e-d83808a0f4b9-utilities\") pod \"redhat-marketplace-z8br6\" (UID: \"6a47899a-263b-4544-843e-d83808a0f4b9\") " pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.259778 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk2wd\" (UniqueName: \"kubernetes.io/projected/6a47899a-263b-4544-843e-d83808a0f4b9-kube-api-access-zk2wd\") pod \"redhat-marketplace-z8br6\" (UID: \"6a47899a-263b-4544-843e-d83808a0f4b9\") " pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.259865 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a47899a-263b-4544-843e-d83808a0f4b9-catalog-content\") pod \"redhat-marketplace-z8br6\" (UID: \"6a47899a-263b-4544-843e-d83808a0f4b9\") " pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.361063 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a47899a-263b-4544-843e-d83808a0f4b9-utilities\") pod \"redhat-marketplace-z8br6\" (UID: \"6a47899a-263b-4544-843e-d83808a0f4b9\") " pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.361560 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk2wd\" (UniqueName: \"kubernetes.io/projected/6a47899a-263b-4544-843e-d83808a0f4b9-kube-api-access-zk2wd\") pod \"redhat-marketplace-z8br6\" (UID: \"6a47899a-263b-4544-843e-d83808a0f4b9\") " pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.361629 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a47899a-263b-4544-843e-d83808a0f4b9-catalog-content\") pod \"redhat-marketplace-z8br6\" (UID: \"6a47899a-263b-4544-843e-d83808a0f4b9\") " pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.361694 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a47899a-263b-4544-843e-d83808a0f4b9-utilities\") pod \"redhat-marketplace-z8br6\" (UID: \"6a47899a-263b-4544-843e-d83808a0f4b9\") " pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.362052 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a47899a-263b-4544-843e-d83808a0f4b9-catalog-content\") pod \"redhat-marketplace-z8br6\" (UID: \"6a47899a-263b-4544-843e-d83808a0f4b9\") " pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.387496 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk2wd\" (UniqueName: \"kubernetes.io/projected/6a47899a-263b-4544-843e-d83808a0f4b9-kube-api-access-zk2wd\") pod \"redhat-marketplace-z8br6\" (UID: \"6a47899a-263b-4544-843e-d83808a0f4b9\") " pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.439790 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.627141 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae401371-fff0-480a-9add-11063773f297" path="/var/lib/kubelet/pods/ae401371-fff0-480a-9add-11063773f297/volumes"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.711937 4691 scope.go:117] "RemoveContainer" containerID="f03b1ae32240fa0cba1fc5277f16e5cadf63f30e69c1e3794665d9b0211e3961"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.713234 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/crc-debug-jp84m"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.961240 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4lkz8/crc-debug-8r2pw"]
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.962575 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/crc-debug-8r2pw"
Dec 02 08:44:58 crc kubenswrapper[4691]: I1202 08:44:58.964837 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4lkz8"/"default-dockercfg-9z8l4"
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.066404 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8br6"]
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.117207 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b506060f-1551-4653-bc75-fd7b1324bd26-host\") pod \"crc-debug-8r2pw\" (UID: \"b506060f-1551-4653-bc75-fd7b1324bd26\") " pod="openshift-must-gather-4lkz8/crc-debug-8r2pw"
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.117391 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx2h9\" (UniqueName: \"kubernetes.io/projected/b506060f-1551-4653-bc75-fd7b1324bd26-kube-api-access-xx2h9\") pod \"crc-debug-8r2pw\" (UID: \"b506060f-1551-4653-bc75-fd7b1324bd26\") " pod="openshift-must-gather-4lkz8/crc-debug-8r2pw"
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.223228 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx2h9\" (UniqueName: \"kubernetes.io/projected/b506060f-1551-4653-bc75-fd7b1324bd26-kube-api-access-xx2h9\") pod \"crc-debug-8r2pw\" (UID: \"b506060f-1551-4653-bc75-fd7b1324bd26\") " pod="openshift-must-gather-4lkz8/crc-debug-8r2pw"
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.223715 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b506060f-1551-4653-bc75-fd7b1324bd26-host\") pod \"crc-debug-8r2pw\" (UID: \"b506060f-1551-4653-bc75-fd7b1324bd26\") " pod="openshift-must-gather-4lkz8/crc-debug-8r2pw"
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.223856 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b506060f-1551-4653-bc75-fd7b1324bd26-host\") pod \"crc-debug-8r2pw\" (UID: \"b506060f-1551-4653-bc75-fd7b1324bd26\") " pod="openshift-must-gather-4lkz8/crc-debug-8r2pw"
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.245178 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx2h9\" (UniqueName: \"kubernetes.io/projected/b506060f-1551-4653-bc75-fd7b1324bd26-kube-api-access-xx2h9\") pod \"crc-debug-8r2pw\" (UID: \"b506060f-1551-4653-bc75-fd7b1324bd26\") " pod="openshift-must-gather-4lkz8/crc-debug-8r2pw"
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.281634 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/crc-debug-8r2pw"
Dec 02 08:44:59 crc kubenswrapper[4691]: W1202 08:44:59.309131 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb506060f_1551_4653_bc75_fd7b1324bd26.slice/crio-3de06a86f7026ab1c5e3876912ac670f367970f1564aab6d9bad18e0d9c08edd WatchSource:0}: Error finding container 3de06a86f7026ab1c5e3876912ac670f367970f1564aab6d9bad18e0d9c08edd: Status 404 returned error can't find the container with id 3de06a86f7026ab1c5e3876912ac670f367970f1564aab6d9bad18e0d9c08edd
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.724486 4691 generic.go:334] "Generic (PLEG): container finished" podID="b506060f-1551-4653-bc75-fd7b1324bd26" containerID="89942a1041164342154c75761701a30ef6ed6c3d6c64615ba2520a018f577e74" exitCode=0
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.724856 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4lkz8/crc-debug-8r2pw" event={"ID":"b506060f-1551-4653-bc75-fd7b1324bd26","Type":"ContainerDied","Data":"89942a1041164342154c75761701a30ef6ed6c3d6c64615ba2520a018f577e74"}
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.724888 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4lkz8/crc-debug-8r2pw" event={"ID":"b506060f-1551-4653-bc75-fd7b1324bd26","Type":"ContainerStarted","Data":"3de06a86f7026ab1c5e3876912ac670f367970f1564aab6d9bad18e0d9c08edd"}
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.728655 4691 generic.go:334] "Generic (PLEG): container finished" podID="6a47899a-263b-4544-843e-d83808a0f4b9" containerID="8a7cd047c65d74cf518c4f06b0e65b091b615a91d5f15ad4d2c3c1710792fe8a" exitCode=0
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.728682 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8br6" event={"ID":"6a47899a-263b-4544-843e-d83808a0f4b9","Type":"ContainerDied","Data":"8a7cd047c65d74cf518c4f06b0e65b091b615a91d5f15ad4d2c3c1710792fe8a"}
Dec 02 08:44:59 crc kubenswrapper[4691]: I1202 08:44:59.728700 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8br6" event={"ID":"6a47899a-263b-4544-843e-d83808a0f4b9","Type":"ContainerStarted","Data":"527cea460883da8e1c965ebb043b7414329a9c9de1889b15cbfa225295bf8fb3"}
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.631166 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"]
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.632508 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"]
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.632597 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.636528 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.636856 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.658326 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4lkz8/crc-debug-8r2pw"]
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.673304 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4lkz8/crc-debug-8r2pw"]
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.717640 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd4zd\" (UniqueName: \"kubernetes.io/projected/a8fbd51d-0032-4fcf-9064-619bfcc5b045-kube-api-access-hd4zd\") pod \"collect-profiles-29411085-p2vrd\" (UID: \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.718519 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8fbd51d-0032-4fcf-9064-619bfcc5b045-secret-volume\") pod \"collect-profiles-29411085-p2vrd\" (UID: \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.718678 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8fbd51d-0032-4fcf-9064-619bfcc5b045-config-volume\") pod \"collect-profiles-29411085-p2vrd\" (UID: \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.820914 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd4zd\" (UniqueName: \"kubernetes.io/projected/a8fbd51d-0032-4fcf-9064-619bfcc5b045-kube-api-access-hd4zd\") pod \"collect-profiles-29411085-p2vrd\" (UID: \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.821432 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8fbd51d-0032-4fcf-9064-619bfcc5b045-secret-volume\") pod \"collect-profiles-29411085-p2vrd\" (UID: \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.821514 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8fbd51d-0032-4fcf-9064-619bfcc5b045-config-volume\") pod \"collect-profiles-29411085-p2vrd\" (UID: \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.822693 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8fbd51d-0032-4fcf-9064-619bfcc5b045-config-volume\") pod \"collect-profiles-29411085-p2vrd\" (UID: \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.827582 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8fbd51d-0032-4fcf-9064-619bfcc5b045-secret-volume\") pod \"collect-profiles-29411085-p2vrd\" (UID: \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.843274 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd4zd\" (UniqueName: \"kubernetes.io/projected/a8fbd51d-0032-4fcf-9064-619bfcc5b045-kube-api-access-hd4zd\") pod \"collect-profiles-29411085-p2vrd\" (UID: \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.902543 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/crc-debug-8r2pw"
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.922816 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b506060f-1551-4653-bc75-fd7b1324bd26-host\") pod \"b506060f-1551-4653-bc75-fd7b1324bd26\" (UID: \"b506060f-1551-4653-bc75-fd7b1324bd26\") "
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.922901 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx2h9\" (UniqueName: \"kubernetes.io/projected/b506060f-1551-4653-bc75-fd7b1324bd26-kube-api-access-xx2h9\") pod \"b506060f-1551-4653-bc75-fd7b1324bd26\" (UID: \"b506060f-1551-4653-bc75-fd7b1324bd26\") "
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.923023 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b506060f-1551-4653-bc75-fd7b1324bd26-host" (OuterVolumeSpecName: "host") pod "b506060f-1551-4653-bc75-fd7b1324bd26" (UID: "b506060f-1551-4653-bc75-fd7b1324bd26"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.923607 4691 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b506060f-1551-4653-bc75-fd7b1324bd26-host\") on node \"crc\" DevicePath \"\""
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.926637 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b506060f-1551-4653-bc75-fd7b1324bd26-kube-api-access-xx2h9" (OuterVolumeSpecName: "kube-api-access-xx2h9") pod "b506060f-1551-4653-bc75-fd7b1324bd26" (UID: "b506060f-1551-4653-bc75-fd7b1324bd26"). InnerVolumeSpecName "kube-api-access-xx2h9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:45:00 crc kubenswrapper[4691]: I1202 08:45:00.964297 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:01 crc kubenswrapper[4691]: I1202 08:45:01.025067 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx2h9\" (UniqueName: \"kubernetes.io/projected/b506060f-1551-4653-bc75-fd7b1324bd26-kube-api-access-xx2h9\") on node \"crc\" DevicePath \"\""
Dec 02 08:45:01 crc kubenswrapper[4691]: I1202 08:45:01.420411 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"]
Dec 02 08:45:01 crc kubenswrapper[4691]: W1202 08:45:01.422720 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8fbd51d_0032_4fcf_9064_619bfcc5b045.slice/crio-52a16241b0dc4a7c25a551511c2272480224021d15c3f387bf9bf39ab7cef39b WatchSource:0}: Error finding container 52a16241b0dc4a7c25a551511c2272480224021d15c3f387bf9bf39ab7cef39b: Status 404 returned error can't find the container with id 52a16241b0dc4a7c25a551511c2272480224021d15c3f387bf9bf39ab7cef39b
Dec 02 08:45:01 crc kubenswrapper[4691]: I1202 08:45:01.930236 4691 generic.go:334] "Generic (PLEG): container finished" podID="6a47899a-263b-4544-843e-d83808a0f4b9" containerID="4f1ad9985488bcae667b31cb5618482a25303411a34bebefb94c9795cf3b8ea0" exitCode=0
Dec 02 08:45:01 crc kubenswrapper[4691]: I1202 08:45:01.930521 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8br6" event={"ID":"6a47899a-263b-4544-843e-d83808a0f4b9","Type":"ContainerDied","Data":"4f1ad9985488bcae667b31cb5618482a25303411a34bebefb94c9795cf3b8ea0"}
Dec 02 08:45:01 crc kubenswrapper[4691]: I1202 08:45:01.943505 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3de06a86f7026ab1c5e3876912ac670f367970f1564aab6d9bad18e0d9c08edd"
Dec 02 08:45:01 crc kubenswrapper[4691]: I1202 08:45:01.943590 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/crc-debug-8r2pw"
Dec 02 08:45:01 crc kubenswrapper[4691]: I1202 08:45:01.952139 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd" event={"ID":"a8fbd51d-0032-4fcf-9064-619bfcc5b045","Type":"ContainerStarted","Data":"5e29f10a05862bd1395c9b87ab9b93c64e9c6cc3bdce4d61067d95c6d9187e59"}
Dec 02 08:45:01 crc kubenswrapper[4691]: I1202 08:45:01.952182 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd" event={"ID":"a8fbd51d-0032-4fcf-9064-619bfcc5b045","Type":"ContainerStarted","Data":"52a16241b0dc4a7c25a551511c2272480224021d15c3f387bf9bf39ab7cef39b"}
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.093227 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-4lkz8/crc-debug-x5f27"]
Dec 02 08:45:02 crc kubenswrapper[4691]: E1202 08:45:02.093865 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b506060f-1551-4653-bc75-fd7b1324bd26" containerName="container-00"
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.093885 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b506060f-1551-4653-bc75-fd7b1324bd26" containerName="container-00"
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.094098 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="b506060f-1551-4653-bc75-fd7b1324bd26" containerName="container-00"
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.094845 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/crc-debug-x5f27"
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.097201 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-4lkz8"/"default-dockercfg-9z8l4"
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.181597 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6783594-a1e3-4d8d-b22f-89328a76e124-host\") pod \"crc-debug-x5f27\" (UID: \"d6783594-a1e3-4d8d-b22f-89328a76e124\") " pod="openshift-must-gather-4lkz8/crc-debug-x5f27"
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.181933 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skc84\" (UniqueName: \"kubernetes.io/projected/d6783594-a1e3-4d8d-b22f-89328a76e124-kube-api-access-skc84\") pod \"crc-debug-x5f27\" (UID: \"d6783594-a1e3-4d8d-b22f-89328a76e124\") " pod="openshift-must-gather-4lkz8/crc-debug-x5f27"
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.284042 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6783594-a1e3-4d8d-b22f-89328a76e124-host\") pod \"crc-debug-x5f27\" (UID: \"d6783594-a1e3-4d8d-b22f-89328a76e124\") " pod="openshift-must-gather-4lkz8/crc-debug-x5f27"
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.284168 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6783594-a1e3-4d8d-b22f-89328a76e124-host\") pod \"crc-debug-x5f27\" (UID: \"d6783594-a1e3-4d8d-b22f-89328a76e124\") " pod="openshift-must-gather-4lkz8/crc-debug-x5f27"
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.284188 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skc84\" (UniqueName: \"kubernetes.io/projected/d6783594-a1e3-4d8d-b22f-89328a76e124-kube-api-access-skc84\") pod \"crc-debug-x5f27\" (UID: \"d6783594-a1e3-4d8d-b22f-89328a76e124\") " pod="openshift-must-gather-4lkz8/crc-debug-x5f27"
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.304133 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skc84\" (UniqueName: \"kubernetes.io/projected/d6783594-a1e3-4d8d-b22f-89328a76e124-kube-api-access-skc84\") pod \"crc-debug-x5f27\" (UID: \"d6783594-a1e3-4d8d-b22f-89328a76e124\") " pod="openshift-must-gather-4lkz8/crc-debug-x5f27"
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.411144 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/crc-debug-x5f27"
Dec 02 08:45:02 crc kubenswrapper[4691]: W1202 08:45:02.436146 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6783594_a1e3_4d8d_b22f_89328a76e124.slice/crio-ef8dfa37b678e18eb72029f838a47085a3bd5df7498df9f35a1df044ece63681 WatchSource:0}: Error finding container ef8dfa37b678e18eb72029f838a47085a3bd5df7498df9f35a1df044ece63681: Status 404 returned error can't find the container with id ef8dfa37b678e18eb72029f838a47085a3bd5df7498df9f35a1df044ece63681
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.573098 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b506060f-1551-4653-bc75-fd7b1324bd26" path="/var/lib/kubelet/pods/b506060f-1551-4653-bc75-fd7b1324bd26/volumes"
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.965949 4691 generic.go:334] "Generic (PLEG): container finished" podID="a8fbd51d-0032-4fcf-9064-619bfcc5b045" containerID="5e29f10a05862bd1395c9b87ab9b93c64e9c6cc3bdce4d61067d95c6d9187e59" exitCode=0
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.966060 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd" event={"ID":"a8fbd51d-0032-4fcf-9064-619bfcc5b045","Type":"ContainerDied","Data":"5e29f10a05862bd1395c9b87ab9b93c64e9c6cc3bdce4d61067d95c6d9187e59"}
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.970120 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8br6" event={"ID":"6a47899a-263b-4544-843e-d83808a0f4b9","Type":"ContainerStarted","Data":"be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8"}
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.974096 4691 generic.go:334] "Generic (PLEG): container finished" podID="d6783594-a1e3-4d8d-b22f-89328a76e124" containerID="58489118d749d1d33c46a0fd0f995e6a632e0054107c76bc83a763c0f30804bb" exitCode=0
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.974136 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4lkz8/crc-debug-x5f27" event={"ID":"d6783594-a1e3-4d8d-b22f-89328a76e124","Type":"ContainerDied","Data":"58489118d749d1d33c46a0fd0f995e6a632e0054107c76bc83a763c0f30804bb"}
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.974161 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4lkz8/crc-debug-x5f27" event={"ID":"d6783594-a1e3-4d8d-b22f-89328a76e124","Type":"ContainerStarted","Data":"ef8dfa37b678e18eb72029f838a47085a3bd5df7498df9f35a1df044ece63681"}
Dec 02 08:45:02 crc kubenswrapper[4691]: I1202 08:45:02.996894 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z8br6" podStartSLOduration=3.161000353 podStartE2EDuration="5.99687211s" podCreationTimestamp="2025-12-02 08:44:57 +0000 UTC" firstStartedPulling="2025-12-02 08:44:59.730801047 +0000 UTC m=+3547.514879899" lastFinishedPulling="2025-12-02 08:45:02.566672794 +0000 UTC m=+3550.350751656" observedRunningTime="2025-12-02 08:45:02.989114204 +0000 UTC m=+3550.773193076" watchObservedRunningTime="2025-12-02 08:45:02.99687211 +0000 UTC m=+3550.780950972"
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.040648 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4lkz8/crc-debug-x5f27"]
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.049411 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-4lkz8/crc-debug-x5f27"]
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.399860 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.408439 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd4zd\" (UniqueName: \"kubernetes.io/projected/a8fbd51d-0032-4fcf-9064-619bfcc5b045-kube-api-access-hd4zd\") pod \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\" (UID: \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\") "
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.408815 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8fbd51d-0032-4fcf-9064-619bfcc5b045-secret-volume\") pod \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\" (UID: \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\") "
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.408911 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8fbd51d-0032-4fcf-9064-619bfcc5b045-config-volume\") pod \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\" (UID: \"a8fbd51d-0032-4fcf-9064-619bfcc5b045\") "
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.409695 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8fbd51d-0032-4fcf-9064-619bfcc5b045-config-volume" (OuterVolumeSpecName: "config-volume") pod "a8fbd51d-0032-4fcf-9064-619bfcc5b045" (UID: "a8fbd51d-0032-4fcf-9064-619bfcc5b045"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.415110 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8fbd51d-0032-4fcf-9064-619bfcc5b045-kube-api-access-hd4zd" (OuterVolumeSpecName: "kube-api-access-hd4zd") pod "a8fbd51d-0032-4fcf-9064-619bfcc5b045" (UID: "a8fbd51d-0032-4fcf-9064-619bfcc5b045"). InnerVolumeSpecName "kube-api-access-hd4zd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.415893 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8fbd51d-0032-4fcf-9064-619bfcc5b045-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a8fbd51d-0032-4fcf-9064-619bfcc5b045" (UID: "a8fbd51d-0032-4fcf-9064-619bfcc5b045"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.510327 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd4zd\" (UniqueName: \"kubernetes.io/projected/a8fbd51d-0032-4fcf-9064-619bfcc5b045-kube-api-access-hd4zd\") on node \"crc\" DevicePath \"\""
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.510361 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a8fbd51d-0032-4fcf-9064-619bfcc5b045-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.510371 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a8fbd51d-0032-4fcf-9064-619bfcc5b045-config-volume\") on node \"crc\" DevicePath \"\""
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.985861 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd" event={"ID":"a8fbd51d-0032-4fcf-9064-619bfcc5b045","Type":"ContainerDied","Data":"52a16241b0dc4a7c25a551511c2272480224021d15c3f387bf9bf39ab7cef39b"}
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.985919 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52a16241b0dc4a7c25a551511c2272480224021d15c3f387bf9bf39ab7cef39b"
Dec 02 08:45:03 crc kubenswrapper[4691]: I1202 08:45:03.985979 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411085-p2vrd"
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.048204 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/crc-debug-x5f27"
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.222897 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skc84\" (UniqueName: \"kubernetes.io/projected/d6783594-a1e3-4d8d-b22f-89328a76e124-kube-api-access-skc84\") pod \"d6783594-a1e3-4d8d-b22f-89328a76e124\" (UID: \"d6783594-a1e3-4d8d-b22f-89328a76e124\") "
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.223091 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6783594-a1e3-4d8d-b22f-89328a76e124-host\") pod \"d6783594-a1e3-4d8d-b22f-89328a76e124\" (UID: \"d6783594-a1e3-4d8d-b22f-89328a76e124\") "
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.223663 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6783594-a1e3-4d8d-b22f-89328a76e124-host" (OuterVolumeSpecName: "host") pod "d6783594-a1e3-4d8d-b22f-89328a76e124" (UID: "d6783594-a1e3-4d8d-b22f-89328a76e124"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.231541 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6783594-a1e3-4d8d-b22f-89328a76e124-kube-api-access-skc84" (OuterVolumeSpecName: "kube-api-access-skc84") pod "d6783594-a1e3-4d8d-b22f-89328a76e124" (UID: "d6783594-a1e3-4d8d-b22f-89328a76e124"). InnerVolumeSpecName "kube-api-access-skc84". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.325579 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skc84\" (UniqueName: \"kubernetes.io/projected/d6783594-a1e3-4d8d-b22f-89328a76e124-kube-api-access-skc84\") on node \"crc\" DevicePath \"\""
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.325622 4691 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6783594-a1e3-4d8d-b22f-89328a76e124-host\") on node \"crc\" DevicePath \"\""
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.501171 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8"]
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.512227 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411040-tdpp8"]
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.579339 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c14266eb-cdfd-4665-a26b-f545bcbdff7d" path="/var/lib/kubelet/pods/c14266eb-cdfd-4665-a26b-f545bcbdff7d/volumes"
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.582640 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6783594-a1e3-4d8d-b22f-89328a76e124" path="/var/lib/kubelet/pods/d6783594-a1e3-4d8d-b22f-89328a76e124/volumes"
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.998931 4691 scope.go:117] "RemoveContainer" containerID="58489118d749d1d33c46a0fd0f995e6a632e0054107c76bc83a763c0f30804bb"
Dec 02 08:45:04 crc kubenswrapper[4691]: I1202 08:45:04.999097 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/crc-debug-x5f27"
Dec 02 08:45:08 crc kubenswrapper[4691]: I1202 08:45:08.440340 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:45:08 crc kubenswrapper[4691]: I1202 08:45:08.441091 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:45:08 crc kubenswrapper[4691]: I1202 08:45:08.502434 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:45:09 crc kubenswrapper[4691]: I1202 08:45:09.103179 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:45:09 crc kubenswrapper[4691]: I1202 08:45:09.151397 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8br6"]
Dec 02 08:45:11 crc kubenswrapper[4691]: I1202 08:45:11.067595 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z8br6" podUID="6a47899a-263b-4544-843e-d83808a0f4b9" containerName="registry-server" containerID="cri-o://be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8" gracePeriod=2
Dec 02 08:45:11 crc kubenswrapper[4691]: I1202 08:45:11.640918 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:45:11 crc kubenswrapper[4691]: I1202 08:45:11.669984 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a47899a-263b-4544-843e-d83808a0f4b9-utilities\") pod \"6a47899a-263b-4544-843e-d83808a0f4b9\" (UID: \"6a47899a-263b-4544-843e-d83808a0f4b9\") "
Dec 02 08:45:11 crc kubenswrapper[4691]: I1202 08:45:11.670031 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a47899a-263b-4544-843e-d83808a0f4b9-catalog-content\") pod \"6a47899a-263b-4544-843e-d83808a0f4b9\" (UID: \"6a47899a-263b-4544-843e-d83808a0f4b9\") "
Dec 02 08:45:11 crc kubenswrapper[4691]: I1202 08:45:11.670119 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk2wd\" (UniqueName: \"kubernetes.io/projected/6a47899a-263b-4544-843e-d83808a0f4b9-kube-api-access-zk2wd\") pod \"6a47899a-263b-4544-843e-d83808a0f4b9\" (UID: \"6a47899a-263b-4544-843e-d83808a0f4b9\") "
Dec 02 08:45:11 crc kubenswrapper[4691]: I1202 08:45:11.671258 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a47899a-263b-4544-843e-d83808a0f4b9-utilities" (OuterVolumeSpecName: "utilities") pod "6a47899a-263b-4544-843e-d83808a0f4b9" (UID: "6a47899a-263b-4544-843e-d83808a0f4b9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 08:45:11 crc kubenswrapper[4691]: I1202 08:45:11.677991 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a47899a-263b-4544-843e-d83808a0f4b9-kube-api-access-zk2wd" (OuterVolumeSpecName: "kube-api-access-zk2wd") pod "6a47899a-263b-4544-843e-d83808a0f4b9" (UID: "6a47899a-263b-4544-843e-d83808a0f4b9"). InnerVolumeSpecName "kube-api-access-zk2wd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:45:11 crc kubenswrapper[4691]: I1202 08:45:11.736311 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a47899a-263b-4544-843e-d83808a0f4b9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a47899a-263b-4544-843e-d83808a0f4b9" (UID: "6a47899a-263b-4544-843e-d83808a0f4b9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 08:45:11 crc kubenswrapper[4691]: I1202 08:45:11.814279 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a47899a-263b-4544-843e-d83808a0f4b9-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 08:45:11 crc kubenswrapper[4691]: I1202 08:45:11.814314 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a47899a-263b-4544-843e-d83808a0f4b9-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 08:45:11 crc kubenswrapper[4691]: I1202 08:45:11.814327 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk2wd\" (UniqueName: \"kubernetes.io/projected/6a47899a-263b-4544-843e-d83808a0f4b9-kube-api-access-zk2wd\") on node \"crc\" DevicePath \"\""
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.080905 4691 generic.go:334] "Generic (PLEG): container finished" podID="6a47899a-263b-4544-843e-d83808a0f4b9" containerID="be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8" exitCode=0
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.080959 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8br6" event={"ID":"6a47899a-263b-4544-843e-d83808a0f4b9","Type":"ContainerDied","Data":"be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8"}
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.080977 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8br6"
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.081013 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8br6" event={"ID":"6a47899a-263b-4544-843e-d83808a0f4b9","Type":"ContainerDied","Data":"527cea460883da8e1c965ebb043b7414329a9c9de1889b15cbfa225295bf8fb3"}
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.081042 4691 scope.go:117] "RemoveContainer" containerID="be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8"
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.103236 4691 scope.go:117] "RemoveContainer" containerID="4f1ad9985488bcae667b31cb5618482a25303411a34bebefb94c9795cf3b8ea0"
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.117674 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8br6"]
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.136619 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8br6"]
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.141973 4691 scope.go:117] "RemoveContainer" containerID="8a7cd047c65d74cf518c4f06b0e65b091b615a91d5f15ad4d2c3c1710792fe8a"
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.174751 4691 scope.go:117] "RemoveContainer" containerID="be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8"
Dec 02 08:45:12 crc kubenswrapper[4691]: E1202 08:45:12.176349 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8\": container with ID starting with be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8 not found: ID does not exist" containerID="be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8"
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.176381 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8"} err="failed to get container status \"be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8\": rpc error: code = NotFound desc = could not find container \"be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8\": container with ID starting with be79ec40a8f45dca93a9bb6397441930f374287b24e03462452439231a9533a8 not found: ID does not exist"
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.176407 4691 scope.go:117] "RemoveContainer" containerID="4f1ad9985488bcae667b31cb5618482a25303411a34bebefb94c9795cf3b8ea0"
Dec 02 08:45:12 crc kubenswrapper[4691]: E1202 08:45:12.176736 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f1ad9985488bcae667b31cb5618482a25303411a34bebefb94c9795cf3b8ea0\": container with ID starting with 4f1ad9985488bcae667b31cb5618482a25303411a34bebefb94c9795cf3b8ea0 not found: ID does not exist" containerID="4f1ad9985488bcae667b31cb5618482a25303411a34bebefb94c9795cf3b8ea0"
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.176977 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1ad9985488bcae667b31cb5618482a25303411a34bebefb94c9795cf3b8ea0"} err="failed to get container status \"4f1ad9985488bcae667b31cb5618482a25303411a34bebefb94c9795cf3b8ea0\": rpc error: code = NotFound desc = could not find container \"4f1ad9985488bcae667b31cb5618482a25303411a34bebefb94c9795cf3b8ea0\": container with ID starting with 4f1ad9985488bcae667b31cb5618482a25303411a34bebefb94c9795cf3b8ea0 not found: ID does not exist"
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.176992 4691 scope.go:117] "RemoveContainer" containerID="8a7cd047c65d74cf518c4f06b0e65b091b615a91d5f15ad4d2c3c1710792fe8a"
Dec 02 08:45:12 crc kubenswrapper[4691]: E1202 08:45:12.177230 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a7cd047c65d74cf518c4f06b0e65b091b615a91d5f15ad4d2c3c1710792fe8a\": container with ID starting with 8a7cd047c65d74cf518c4f06b0e65b091b615a91d5f15ad4d2c3c1710792fe8a not found: ID does not exist" containerID="8a7cd047c65d74cf518c4f06b0e65b091b615a91d5f15ad4d2c3c1710792fe8a"
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.177249 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a7cd047c65d74cf518c4f06b0e65b091b615a91d5f15ad4d2c3c1710792fe8a"} err="failed to get container status \"8a7cd047c65d74cf518c4f06b0e65b091b615a91d5f15ad4d2c3c1710792fe8a\": rpc error: code = NotFound desc = could not find container \"8a7cd047c65d74cf518c4f06b0e65b091b615a91d5f15ad4d2c3c1710792fe8a\": container with ID starting with 8a7cd047c65d74cf518c4f06b0e65b091b615a91d5f15ad4d2c3c1710792fe8a not found: ID does not exist"
Dec 02 08:45:12 crc kubenswrapper[4691]: I1202 08:45:12.573163 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a47899a-263b-4544-843e-d83808a0f4b9" path="/var/lib/kubelet/pods/6a47899a-263b-4544-843e-d83808a0f4b9/volumes"
Dec 02 08:45:20 crc kubenswrapper[4691]: I1202 08:45:20.338313 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-86bc664f5b-6lklq_ff11f2ca-96b6-4cd2-85b8-88916b74efc7/barbican-api/0.log"
Dec 02 08:45:20 crc kubenswrapper[4691]: I1202 08:45:20.496493 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-598b69485d-n2fl7_d855ba5e-92ab-4a5e-b613-f49c9fec44b1/barbican-keystone-listener/0.log"
Dec 02 08:45:20 crc kubenswrapper[4691]: I1202 08:45:20.506834 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-86bc664f5b-6lklq_ff11f2ca-96b6-4cd2-85b8-88916b74efc7/barbican-api-log/0.log"
Dec 02 08:45:20 crc kubenswrapper[4691]: I1202 08:45:20.589121 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-598b69485d-n2fl7_d855ba5e-92ab-4a5e-b613-f49c9fec44b1/barbican-keystone-listener-log/0.log"
Dec 02 08:45:20 crc kubenswrapper[4691]: I1202 08:45:20.759948 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57597556c5-xzp56_3c13685c-ff97-4074-8bc8-5659d16ec95d/barbican-worker/0.log"
Dec 02 08:45:20 crc kubenswrapper[4691]: I1202 08:45:20.785162 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57597556c5-xzp56_3c13685c-ff97-4074-8bc8-5659d16ec95d/barbican-worker-log/0.log"
Dec 02 08:45:20 crc kubenswrapper[4691]: I1202 08:45:20.931078 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ws572_6eabed67-587a-402c-8d6f-02163a229356/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:45:20 crc kubenswrapper[4691]: I1202 08:45:20.990203 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79ed78c7-a8cc-4ad0-a0cc-38c0f226df93/ceilometer-central-agent/0.log"
Dec 02 08:45:21 crc kubenswrapper[4691]: I1202 08:45:21.087517 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79ed78c7-a8cc-4ad0-a0cc-38c0f226df93/ceilometer-notification-agent/0.log"
Dec 02 08:45:21 crc kubenswrapper[4691]: I1202 08:45:21.167368 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79ed78c7-a8cc-4ad0-a0cc-38c0f226df93/proxy-httpd/0.log"
Dec 02 08:45:21 crc kubenswrapper[4691]: I1202 08:45:21.234947 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79ed78c7-a8cc-4ad0-a0cc-38c0f226df93/sg-core/0.log"
Dec 02 08:45:21 crc kubenswrapper[4691]: I1202 08:45:21.333595 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0e3d72e5-726a-4f4b-a677-6237021e8747/cinder-api/0.log"
Dec 02 08:45:21 crc kubenswrapper[4691]: I1202 08:45:21.397117 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0e3d72e5-726a-4f4b-a677-6237021e8747/cinder-api-log/0.log"
Dec 02 08:45:21 crc kubenswrapper[4691]: I1202 08:45:21.514727 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cd7d9765-5aa9-4f8d-af36-53dfbba7da81/cinder-scheduler/0.log"
Dec 02 08:45:21 crc kubenswrapper[4691]: I1202 08:45:21.601961 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cd7d9765-5aa9-4f8d-af36-53dfbba7da81/probe/0.log"
Dec 02 08:45:21 crc kubenswrapper[4691]: I1202 08:45:21.757819 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq_9c3bebb2-7f42-4553-83b6-7fafbb022c70/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:45:21 crc kubenswrapper[4691]: I1202 08:45:21.854423 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-lvg66_4fde2bba-1e5a-47a2-a918-8e57f11e6d95/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:45:21 crc kubenswrapper[4691]: I1202 08:45:21.899150 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 02 08:45:21 crc kubenswrapper[4691]: I1202 08:45:21.899284 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 02 08:45:21 crc kubenswrapper[4691]: I1202 08:45:21.972056 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-g4pgk_fa0cb344-97e1-42ae-867b-30322564459d/init/0.log"
Dec 02 08:45:22 crc kubenswrapper[4691]: I1202 08:45:22.414425 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-g4pgk_fa0cb344-97e1-42ae-867b-30322564459d/init/0.log"
Dec 02 08:45:22 crc kubenswrapper[4691]: I1202 08:45:22.463237 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh_90715156-30f9-4dfc-9c78-374f0a07bb4c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:45:22 crc kubenswrapper[4691]: I1202 08:45:22.479310 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-g4pgk_fa0cb344-97e1-42ae-867b-30322564459d/dnsmasq-dns/0.log"
Dec 02 08:45:22 crc kubenswrapper[4691]: I1202 08:45:22.688948 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_23424898-747d-4eef-8f7e-ee64e1bf1070/glance-httpd/0.log"
Dec 02 08:45:22 crc kubenswrapper[4691]: I1202 08:45:22.717905 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_23424898-747d-4eef-8f7e-ee64e1bf1070/glance-log/0.log"
Dec 02 08:45:22 crc kubenswrapper[4691]: I1202 08:45:22.904368 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0837fb6a-ad2a-4110-bec4-727f9daa999c/glance-log/0.log"
Dec 02 08:45:22 crc kubenswrapper[4691]: I1202 08:45:22.962553 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0837fb6a-ad2a-4110-bec4-727f9daa999c/glance-httpd/0.log"
Dec 02 08:45:23 crc kubenswrapper[4691]: I1202 08:45:23.016926 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6585c7db4b-jz894_76022e0c-2dd2-4395-8607-aa13da42f557/horizon/0.log"
Dec 02 08:45:23 crc kubenswrapper[4691]: I1202 08:45:23.220115 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt_80054e7f-3448-487b-8f0e-fc5eda159e57/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:45:23 crc kubenswrapper[4691]: I1202 08:45:23.403161 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6585c7db4b-jz894_76022e0c-2dd2-4395-8607-aa13da42f557/horizon-log/0.log"
Dec 02 08:45:23 crc kubenswrapper[4691]: I1202 08:45:23.511834 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lx2xx_c4804dc1-5ac2-422e-87fe-71120becde69/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:45:23 crc kubenswrapper[4691]: I1202 08:45:23.650920 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55c7cfcf8b-8rs5b_f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd/keystone-api/0.log"
Dec 02 08:45:23 crc kubenswrapper[4691]: I1202 08:45:23.785265 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6960ef77-d277-4be6-be89-446664dd7775/kube-state-metrics/0.log"
Dec 02 08:45:23 crc kubenswrapper[4691]: I1202 08:45:23.909574 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb_d1c6d92a-1daf-4554-822b-1c946124e1d0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:45:24 crc kubenswrapper[4691]: I1202 08:45:24.316213 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-95567dd97-rcpxr_22255ebb-1831-4d2e-966b-1ae2fee83ebf/neutron-api/0.log"
Dec 02 08:45:24 crc kubenswrapper[4691]: I1202 08:45:24.347355 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-95567dd97-rcpxr_22255ebb-1831-4d2e-966b-1ae2fee83ebf/neutron-httpd/0.log"
Dec 02 08:45:24 crc kubenswrapper[4691]: I1202 08:45:24.374885 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp_c64d5e17-b659-47c6-aa5b-a62be849ee69/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:45:24 crc kubenswrapper[4691]: I1202 08:45:24.992731 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a12282a4-2fdb-4627-b2ff-06dbde0d2fdb/nova-api-log/0.log"
Dec 02 08:45:25 crc kubenswrapper[4691]: I1202 08:45:25.047731 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437/nova-cell0-conductor-conductor/0.log"
Dec 02 08:45:25 crc kubenswrapper[4691]: I1202 08:45:25.288513 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a12282a4-2fdb-4627-b2ff-06dbde0d2fdb/nova-api-api/0.log"
Dec 02 08:45:25 crc kubenswrapper[4691]: I1202 08:45:25.317645 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e057af5a-bcd5-4612-9c03-350147146c52/nova-cell1-conductor-conductor/0.log"
Dec 02 08:45:25 crc kubenswrapper[4691]: I1202 08:45:25.388510 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0dedf0d3-f3c7-4cb1-9003-8ac588994c43/nova-cell1-novncproxy-novncproxy/0.log"
Dec 02 08:45:25 crc kubenswrapper[4691]: I1202 08:45:25.939642 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cd1a81bc-6c1f-4caa-917a-900c527f0df5/nova-metadata-log/0.log"
Dec 02 08:45:25 crc kubenswrapper[4691]: I1202 08:45:25.950556 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-vf7w6_baf13f0c-0ba4-4e4f-95cb-2de2f510801e/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:45:26 crc kubenswrapper[4691]: I1202 08:45:26.282664 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f14bc2d4-ce0c-440d-9e1d-15b0b8716562/mysql-bootstrap/0.log"
Dec 02 08:45:26 crc kubenswrapper[4691]: I1202 08:45:26.324846
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6ff86164-f22f-49e6-8933-e599da966506/nova-scheduler-scheduler/0.log" Dec 02 08:45:26 crc kubenswrapper[4691]: I1202 08:45:26.546288 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f14bc2d4-ce0c-440d-9e1d-15b0b8716562/galera/0.log" Dec 02 08:45:26 crc kubenswrapper[4691]: I1202 08:45:26.548341 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f14bc2d4-ce0c-440d-9e1d-15b0b8716562/mysql-bootstrap/0.log" Dec 02 08:45:26 crc kubenswrapper[4691]: I1202 08:45:26.786548 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_aa4f9395-a46a-40e4-a80c-c9b43caadc0b/mysql-bootstrap/0.log" Dec 02 08:45:27 crc kubenswrapper[4691]: I1202 08:45:27.009804 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_aa4f9395-a46a-40e4-a80c-c9b43caadc0b/galera/0.log" Dec 02 08:45:27 crc kubenswrapper[4691]: I1202 08:45:27.012094 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_aa4f9395-a46a-40e4-a80c-c9b43caadc0b/mysql-bootstrap/0.log" Dec 02 08:45:27 crc kubenswrapper[4691]: I1202 08:45:27.191621 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7/openstackclient/0.log" Dec 02 08:45:27 crc kubenswrapper[4691]: I1202 08:45:27.287295 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cd1a81bc-6c1f-4caa-917a-900c527f0df5/nova-metadata-metadata/0.log" Dec 02 08:45:27 crc kubenswrapper[4691]: I1202 08:45:27.318128 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9jwww_ed3b8ff0-c19d-4614-abe4-0ad6b5801b78/ovn-controller/0.log" Dec 02 08:45:27 crc kubenswrapper[4691]: I1202 08:45:27.560358 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mjcrr_0cdf429e-b93d-4009-aaa1-1c45a0083363/openstack-network-exporter/0.log" Dec 02 08:45:27 crc kubenswrapper[4691]: I1202 08:45:27.563686 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-twpkf_41e5a6f8-bc4a-43d6-b49b-d065f6cef159/ovsdb-server-init/0.log" Dec 02 08:45:27 crc kubenswrapper[4691]: I1202 08:45:27.810563 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-twpkf_41e5a6f8-bc4a-43d6-b49b-d065f6cef159/ovsdb-server-init/0.log" Dec 02 08:45:27 crc kubenswrapper[4691]: I1202 08:45:27.828451 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-twpkf_41e5a6f8-bc4a-43d6-b49b-d065f6cef159/ovs-vswitchd/0.log" Dec 02 08:45:27 crc kubenswrapper[4691]: I1202 08:45:27.860951 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-twpkf_41e5a6f8-bc4a-43d6-b49b-d065f6cef159/ovsdb-server/0.log" Dec 02 08:45:28 crc kubenswrapper[4691]: I1202 08:45:28.098529 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_23291b10-f1ed-4d19-9689-62bdf530e28e/openstack-network-exporter/0.log" Dec 02 08:45:28 crc kubenswrapper[4691]: I1202 08:45:28.119805 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-tkz82_87c36120-368b-47ab-baff-e007b39fc1d0/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 08:45:28 crc kubenswrapper[4691]: I1202 08:45:28.208424 
4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_23291b10-f1ed-4d19-9689-62bdf530e28e/ovn-northd/0.log" Dec 02 08:45:28 crc kubenswrapper[4691]: I1202 08:45:28.384325 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4edb0266-7a9e-4e28-810c-7136d8336f1b/openstack-network-exporter/0.log" Dec 02 08:45:28 crc kubenswrapper[4691]: I1202 08:45:28.557320 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4edb0266-7a9e-4e28-810c-7136d8336f1b/ovsdbserver-nb/0.log" Dec 02 08:45:28 crc kubenswrapper[4691]: I1202 08:45:28.640071 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9dfe73d8-e7c6-4906-bb6c-64c13435c53f/openstack-network-exporter/0.log" Dec 02 08:45:28 crc kubenswrapper[4691]: I1202 08:45:28.695838 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9dfe73d8-e7c6-4906-bb6c-64c13435c53f/ovsdbserver-sb/0.log" Dec 02 08:45:28 crc kubenswrapper[4691]: I1202 08:45:28.968152 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-99bc7c96-6nbmb_1fd0d9c6-1443-4198-8319-642b450eecb8/placement-log/0.log" Dec 02 08:45:28 crc kubenswrapper[4691]: I1202 08:45:28.978218 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-99bc7c96-6nbmb_1fd0d9c6-1443-4198-8319-642b450eecb8/placement-api/0.log" Dec 02 08:45:29 crc kubenswrapper[4691]: I1202 08:45:29.088640 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0573471b-7d3a-484d-9195-87918928a753/setup-container/0.log" Dec 02 08:45:29 crc kubenswrapper[4691]: I1202 08:45:29.286671 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0573471b-7d3a-484d-9195-87918928a753/setup-container/0.log" Dec 02 08:45:29 crc kubenswrapper[4691]: I1202 08:45:29.299403 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0573471b-7d3a-484d-9195-87918928a753/rabbitmq/0.log" Dec 02 08:45:29 crc kubenswrapper[4691]: I1202 08:45:29.550175 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_178767a6-fba0-4c85-ab0c-0a3a1ffcc627/setup-container/0.log" Dec 02 08:45:29 crc kubenswrapper[4691]: I1202 08:45:29.784104 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_178767a6-fba0-4c85-ab0c-0a3a1ffcc627/setup-container/0.log" Dec 02 08:45:29 crc kubenswrapper[4691]: I1202 08:45:29.857418 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_178767a6-fba0-4c85-ab0c-0a3a1ffcc627/rabbitmq/0.log" Dec 02 08:45:29 crc kubenswrapper[4691]: I1202 08:45:29.967870 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx_31d7c220-1ece-46e7-bbe3-1737890c15e0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 08:45:30 crc kubenswrapper[4691]: I1202 08:45:30.112379 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-c7mwj_34cc12b6-9f55-450f-b073-0e89d0889946/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 08:45:30 crc kubenswrapper[4691]: I1202 08:45:30.237356 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr_2a8c0a05-f1a6-4a5e-9598-9146f0074dc1/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 08:45:30 crc kubenswrapper[4691]: I1202 08:45:30.327716 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nq6h7_1607b522-c05f-4f86-b8cb-79caa03799ed/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 08:45:30 crc kubenswrapper[4691]: I1202 08:45:30.594277 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-fsbvf_bfc83744-d9e3-4520-96ea-2ce6e382af39/ssh-known-hosts-edpm-deployment/0.log" Dec 02 08:45:30 crc kubenswrapper[4691]: I1202 08:45:30.722161 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76d578d5f5-hmcbw_de7d695d-6d9a-4de2-830e-579f9d496f08/proxy-httpd/0.log" Dec 02 08:45:30 crc kubenswrapper[4691]: I1202 08:45:30.733260 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76d578d5f5-hmcbw_de7d695d-6d9a-4de2-830e-579f9d496f08/proxy-server/0.log" Dec 02 08:45:30 crc kubenswrapper[4691]: I1202 08:45:30.933569 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-lchdp_82672deb-2527-4d18-8006-0f794dfe97c0/swift-ring-rebalance/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.013559 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/account-auditor/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.135290 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/account-reaper/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.229919 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/account-server/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.258271 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/container-auditor/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.258497 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/account-replicator/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.376167 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/container-replicator/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.479393 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/object-auditor/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.483009 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/container-server/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.496298 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/container-updater/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.600017 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/object-expirer/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: 
I1202 08:45:31.703996 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/object-replicator/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.704137 4691 scope.go:117] "RemoveContainer" containerID="cf02052188f0ab04cd72818b1ca0b7f542f6428abd5b10ddff25c96810ded2ea" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.737948 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/object-updater/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.780252 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/object-server/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.935274 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/swift-recon-cron/0.log" Dec 02 08:45:31 crc kubenswrapper[4691]: I1202 08:45:31.935313 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/rsync/0.log" Dec 02 08:45:32 crc kubenswrapper[4691]: I1202 08:45:32.098390 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg_a5fcdaa5-c1a6-4f23-b953-0d31524ee62f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 08:45:32 crc kubenswrapper[4691]: I1202 08:45:32.246842 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0d635e45-a63d-4661-9b82-b21d8ce59623/tempest-tests-tempest-tests-runner/0.log" Dec 02 08:45:32 crc kubenswrapper[4691]: I1202 08:45:32.420318 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0e4e77b4-f638-453e-9408-61dae4d0f68a/test-operator-logs-container/0.log" Dec 02 08:45:32 crc kubenswrapper[4691]: I1202 08:45:32.520466 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-67hl6_5f7ee74e-e2c8-4144-9643-4df288709175/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 02 08:45:38 crc kubenswrapper[4691]: I1202 08:45:38.531497 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_b3c7c69c-4fd9-4483-89b7-202f766ce6e5/memcached/0.log" Dec 02 08:45:51 crc kubenswrapper[4691]: I1202 08:45:51.898718 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:45:51 crc kubenswrapper[4691]: I1202 08:45:51.899423 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:45:58 crc kubenswrapper[4691]: I1202 08:45:58.221707 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-w9hbv_5369081c-2142-4dfa-9482-b8d8d6d4195f/kube-rbac-proxy/0.log" Dec 02 08:45:58 crc kubenswrapper[4691]: I1202 08:45:58.351656 4691 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-w9hbv_5369081c-2142-4dfa-9482-b8d8d6d4195f/manager/0.log" Dec 02 08:45:58 crc kubenswrapper[4691]: I1202 08:45:58.425899 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-66zbg_e88d7782-bcf8-4d40-aa1c-269533471279/kube-rbac-proxy/0.log" Dec 02 08:45:58 crc kubenswrapper[4691]: I1202 08:45:58.545386 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-66zbg_e88d7782-bcf8-4d40-aa1c-269533471279/manager/0.log" Dec 02 08:45:58 crc kubenswrapper[4691]: I1202 08:45:58.640998 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-c7ftc_9e639d67-2200-474e-9be7-55bef7c97fe6/kube-rbac-proxy/0.log" Dec 02 08:45:58 crc kubenswrapper[4691]: I1202 08:45:58.723954 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-c7ftc_9e639d67-2200-474e-9be7-55bef7c97fe6/manager/0.log" Dec 02 08:45:58 crc kubenswrapper[4691]: I1202 08:45:58.773482 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/util/0.log" Dec 02 08:45:59 crc kubenswrapper[4691]: I1202 08:45:59.016318 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/pull/0.log" Dec 02 08:45:59 crc kubenswrapper[4691]: I1202 08:45:59.018420 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/pull/0.log" Dec 02 08:45:59 crc kubenswrapper[4691]: I1202 08:45:59.063873 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/util/0.log" Dec 02 08:45:59 crc kubenswrapper[4691]: I1202 08:45:59.219195 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/pull/0.log" Dec 02 08:45:59 crc kubenswrapper[4691]: I1202 08:45:59.226154 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/util/0.log" Dec 02 08:45:59 crc kubenswrapper[4691]: I1202 08:45:59.272619 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/extract/0.log" Dec 02 08:45:59 crc kubenswrapper[4691]: I1202 08:45:59.441379 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-n5f6s_f2666d2b-c30c-40d4-bfab-0e6d00571ecc/kube-rbac-proxy/0.log" Dec 02 08:45:59 crc kubenswrapper[4691]: I1202 08:45:59.517457 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-qltbw_0361226c-435b-4221-b59d-74900b2552e1/kube-rbac-proxy/0.log" Dec 02 08:45:59 
crc kubenswrapper[4691]: I1202 08:45:59.528452 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-n5f6s_f2666d2b-c30c-40d4-bfab-0e6d00571ecc/manager/0.log" Dec 02 08:45:59 crc kubenswrapper[4691]: I1202 08:45:59.698316 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-qltbw_0361226c-435b-4221-b59d-74900b2552e1/manager/0.log" Dec 02 08:45:59 crc kubenswrapper[4691]: I1202 08:45:59.755921 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-d898r_c92e2d03-8848-432f-82f4-fd28b3b0fa34/manager/0.log" Dec 02 08:45:59 crc kubenswrapper[4691]: I1202 08:45:59.775167 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-d898r_c92e2d03-8848-432f-82f4-fd28b3b0fa34/kube-rbac-proxy/0.log" Dec 02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.025468 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-n5t7p_b930ff47-307d-47b3-9b84-54e5860ee2db/kube-rbac-proxy/0.log" Dec 02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.117496 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-n5t7p_b930ff47-307d-47b3-9b84-54e5860ee2db/manager/0.log" Dec 02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.133194 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-p2dmm_a1687ef3-9bf5-451e-aa8a-22ede53d9ed9/kube-rbac-proxy/0.log" Dec 02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.205928 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-p2dmm_a1687ef3-9bf5-451e-aa8a-22ede53d9ed9/manager/0.log" Dec 02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.291996 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-44twc_759a905a-dc61-4206-862f-cb8b6f85882f/kube-rbac-proxy/0.log" Dec 02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.390232 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-44twc_759a905a-dc61-4206-862f-cb8b6f85882f/manager/0.log" Dec 02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.490210 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-28bs2_36d938bc-e3d6-4f21-8327-5f655a4ef54a/kube-rbac-proxy/0.log" Dec 02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.522606 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-28bs2_36d938bc-e3d6-4f21-8327-5f655a4ef54a/manager/0.log" Dec 02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.629546 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-lhkdn_0667dbf1-e305-4d12-af7b-3d532a834609/kube-rbac-proxy/0.log" Dec 02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.695331 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-lhkdn_0667dbf1-e305-4d12-af7b-3d532a834609/manager/0.log" Dec 
02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.782686 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-6tdpt_f4783498-99ba-42cc-9312-8e8c6b279e5a/kube-rbac-proxy/0.log" Dec 02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.905357 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-6tdpt_f4783498-99ba-42cc-9312-8e8c6b279e5a/manager/0.log" Dec 02 08:46:00 crc kubenswrapper[4691]: I1202 08:46:00.957033 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-2g56f_d82caf6e-e4a7-4474-8cd4-6d3f554ce608/kube-rbac-proxy/0.log" Dec 02 08:46:01 crc kubenswrapper[4691]: I1202 08:46:01.159223 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9p97b_e8014abc-9e31-40d3-8e34-d595a8ef95b4/manager/0.log" Dec 02 08:46:01 crc kubenswrapper[4691]: I1202 08:46:01.213891 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9p97b_e8014abc-9e31-40d3-8e34-d595a8ef95b4/kube-rbac-proxy/0.log" Dec 02 08:46:01 crc kubenswrapper[4691]: I1202 08:46:01.331167 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-2g56f_d82caf6e-e4a7-4474-8cd4-6d3f554ce608/manager/0.log" Dec 02 08:46:01 crc kubenswrapper[4691]: I1202 08:46:01.343111 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v_e6bec2f4-8aea-472b-a0f9-591b744f9fe4/kube-rbac-proxy/0.log" Dec 02 08:46:01 crc kubenswrapper[4691]: I1202 08:46:01.436034 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v_e6bec2f4-8aea-472b-a0f9-591b744f9fe4/manager/0.log" Dec 02 08:46:01 crc kubenswrapper[4691]: I1202 08:46:01.811039 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-f2vhc_22ecfc20-6eb6-417f-ab17-8fc55057d5af/registry-server/0.log" Dec 02 08:46:01 crc kubenswrapper[4691]: I1202 08:46:01.872320 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7f9cf644cb-bqpg5_e36ff06e-9c96-433a-88fc-14c6941566ee/operator/0.log" Dec 02 08:46:02 crc kubenswrapper[4691]: I1202 08:46:02.130203 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-gbqw2_09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b/kube-rbac-proxy/0.log" Dec 02 08:46:02 crc kubenswrapper[4691]: I1202 08:46:02.242097 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-gbqw2_09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b/manager/0.log" Dec 02 08:46:02 crc kubenswrapper[4691]: I1202 08:46:02.342790 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-nlwnv_fd33241e-270e-4dbe-b024-e368b2050ece/kube-rbac-proxy/0.log" Dec 02 08:46:02 crc kubenswrapper[4691]: I1202 08:46:02.361707 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-nlwnv_fd33241e-270e-4dbe-b024-e368b2050ece/manager/0.log" Dec 02 08:46:02 crc kubenswrapper[4691]: I1202 08:46:02.543201 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-2kbsh_b42f21ed-f361-4b6f-abc7-03b237501f65/operator/0.log" Dec 02 08:46:02 crc kubenswrapper[4691]: I1202 08:46:02.601002 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-slsfn_dfa342c0-60d5-4025-a881-30d706833e2b/kube-rbac-proxy/0.log" Dec 02 08:46:02 crc kubenswrapper[4691]: I1202 08:46:02.871895 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-85b7b84db-d9nql_e837878c-4a1f-463b-913c-7df163c5ba27/manager/0.log" Dec 02 08:46:02 crc kubenswrapper[4691]: I1202 08:46:02.874302 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-slsfn_dfa342c0-60d5-4025-a881-30d706833e2b/manager/0.log" Dec 02 08:46:02 crc kubenswrapper[4691]: I1202 08:46:02.900502 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-666pb_1a9a40d0-4970-4551-b68e-a9697f250e94/manager/0.log" Dec 02 08:46:02 crc kubenswrapper[4691]: I1202 08:46:02.924502 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-666pb_1a9a40d0-4970-4551-b68e-a9697f250e94/kube-rbac-proxy/0.log" Dec 02 08:46:03 crc kubenswrapper[4691]: I1202 08:46:03.130829 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wnzqd_4a080540-9871-4b8c-9d74-e3f1f3cf317c/kube-rbac-proxy/0.log" Dec 02 08:46:03 crc kubenswrapper[4691]: I1202 08:46:03.151338 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wnzqd_4a080540-9871-4b8c-9d74-e3f1f3cf317c/manager/0.log" Dec 02 08:46:03 crc kubenswrapper[4691]: I1202 08:46:03.169811 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-4k8xk_f75595fc-314d-4aa9-bc60-e82c16361768/kube-rbac-proxy/0.log" Dec 02 08:46:03 crc kubenswrapper[4691]: I1202 08:46:03.353168 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-4k8xk_f75595fc-314d-4aa9-bc60-e82c16361768/manager/0.log" Dec 02 08:46:21 crc kubenswrapper[4691]: I1202 08:46:21.898824 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:46:21 crc kubenswrapper[4691]: I1202 08:46:21.899317 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:46:21 crc kubenswrapper[4691]: I1202 08:46:21.899375 4691 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 08:46:21 crc kubenswrapper[4691]: I1202 08:46:21.899941 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6047dc0d1f247765ff138c50dd8d084fbe9f7fca33383efa09c95136136de241"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:46:21 crc kubenswrapper[4691]: I1202 08:46:21.900012 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://6047dc0d1f247765ff138c50dd8d084fbe9f7fca33383efa09c95136136de241" gracePeriod=600 Dec 02 08:46:22 crc kubenswrapper[4691]: I1202 08:46:22.050289 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="6047dc0d1f247765ff138c50dd8d084fbe9f7fca33383efa09c95136136de241" exitCode=0 Dec 02 08:46:22 crc kubenswrapper[4691]: I1202 08:46:22.050522 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"6047dc0d1f247765ff138c50dd8d084fbe9f7fca33383efa09c95136136de241"} Dec 02 08:46:22 crc kubenswrapper[4691]: I1202 08:46:22.050744 4691 scope.go:117] "RemoveContainer" containerID="62e8ec06c16dc1aaab6aad7319f4f555bc3cd63ae37ec136754b4406e209dce9" Dec 02 08:46:23 crc kubenswrapper[4691]: I1202 08:46:23.225745 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4"} Dec 02 08:46:24 crc kubenswrapper[4691]: I1202 08:46:24.170140 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8x9pv_39b63660-5845-41eb-94ca-ffc8ccb34413/control-plane-machine-set-operator/0.log" Dec 02 08:46:24 crc kubenswrapper[4691]: I1202 08:46:24.375596 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qvdbg_677bf100-9036-4b58-9658-6b918304ba47/kube-rbac-proxy/0.log" Dec 02 08:46:24 crc kubenswrapper[4691]: I1202 08:46:24.376018 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qvdbg_677bf100-9036-4b58-9658-6b918304ba47/machine-api-operator/0.log" Dec 02 08:46:37 crc kubenswrapper[4691]: I1202 08:46:37.895710 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-6jm2z_d29342d1-9924-4226-ae63-6e405a469f70/cert-manager-controller/0.log" Dec 02 08:46:38 crc kubenswrapper[4691]: I1202 08:46:38.095710 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-bjslb_e8a4ac7a-1f01-4c68-b5a2-e2200f0c31de/cert-manager-cainjector/0.log" Dec 02 08:46:38 crc kubenswrapper[4691]: I1202 08:46:38.130303 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-5gp8z_ee012121-3006-49dc-9f6d-349cdbc940a1/cert-manager-webhook/0.log" Dec 02 
08:46:51 crc kubenswrapper[4691]: I1202 08:46:51.304511 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-wqdhp_e4e7cee0-91cd-406a-a496-a13b4ee91e1e/nmstate-console-plugin/0.log" Dec 02 08:46:51 crc kubenswrapper[4691]: I1202 08:46:51.510551 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7scpl_c0ce4e86-7cec-4db1-975d-51ee41f94337/nmstate-handler/0.log" Dec 02 08:46:51 crc kubenswrapper[4691]: I1202 08:46:51.562107 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-f2q6n_dec2c181-9b51-4e2b-95d1-d98fa9102b3a/kube-rbac-proxy/0.log" Dec 02 08:46:51 crc kubenswrapper[4691]: I1202 08:46:51.588260 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-f2q6n_dec2c181-9b51-4e2b-95d1-d98fa9102b3a/nmstate-metrics/0.log" Dec 02 08:46:51 crc kubenswrapper[4691]: I1202 08:46:51.757246 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-mztlh_28cc1c9c-b36a-4783-ba53-4504f085b70d/nmstate-operator/0.log" Dec 02 08:46:51 crc kubenswrapper[4691]: I1202 08:46:51.835194 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-8slld_ef02c668-a715-48cb-8efb-9c52cdb28e9d/nmstate-webhook/0.log" Dec 02 08:47:06 crc kubenswrapper[4691]: I1202 08:47:06.933332 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-zj6r5_d016272b-85ec-410a-9392-050c9c0a5ff1/kube-rbac-proxy/0.log" Dec 02 08:47:07 crc kubenswrapper[4691]: I1202 08:47:07.086596 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-zj6r5_d016272b-85ec-410a-9392-050c9c0a5ff1/controller/0.log" Dec 02 08:47:07 crc kubenswrapper[4691]: I1202 08:47:07.187658 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-frr-files/0.log" Dec 02 08:47:07 crc kubenswrapper[4691]: I1202 08:47:07.360671 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-frr-files/0.log" Dec 02 08:47:07 crc kubenswrapper[4691]: I1202 08:47:07.362503 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-metrics/0.log" Dec 02 08:47:07 crc kubenswrapper[4691]: I1202 08:47:07.385782 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-reloader/0.log" Dec 02 08:47:07 crc kubenswrapper[4691]: I1202 08:47:07.393803 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-reloader/0.log" Dec 02 08:47:07 crc kubenswrapper[4691]: I1202 08:47:07.618569 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-frr-files/0.log" Dec 02 08:47:07 crc kubenswrapper[4691]: I1202 08:47:07.633476 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-reloader/0.log" Dec 02 08:47:07 crc kubenswrapper[4691]: I1202 08:47:07.667148 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-metrics/0.log" Dec 02 08:47:07 crc kubenswrapper[4691]: I1202 08:47:07.690154 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-metrics/0.log" Dec 02 08:47:07 crc kubenswrapper[4691]: I1202 08:47:07.849574 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-reloader/0.log" Dec 02 08:47:08 crc kubenswrapper[4691]: I1202 08:47:08.097903 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/controller/0.log" Dec 02 08:47:08 crc kubenswrapper[4691]: I1202 08:47:08.105364 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-frr-files/0.log" Dec 02 08:47:08 crc kubenswrapper[4691]: I1202 08:47:08.113400 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-metrics/0.log" Dec 02 08:47:08 crc kubenswrapper[4691]: I1202 08:47:08.353688 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/frr-metrics/0.log" Dec 02 08:47:08 crc kubenswrapper[4691]: I1202 08:47:08.363779 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/kube-rbac-proxy-frr/0.log" Dec 02 08:47:08 crc kubenswrapper[4691]: I1202 08:47:08.399557 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/kube-rbac-proxy/0.log" Dec 02 08:47:08 crc kubenswrapper[4691]: I1202 08:47:08.575187 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/reloader/0.log" Dec 02 08:47:08 crc kubenswrapper[4691]: I1202 08:47:08.621835 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-7vgpc_14bf1d46-7291-4940-9d53-9361142ad142/frr-k8s-webhook-server/0.log" Dec 02 08:47:08 crc kubenswrapper[4691]: I1202 08:47:08.876140 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-568f45db7d-dr99j_f921d74e-40cb-430a-a228-ec4681e9251d/manager/0.log" Dec 02 08:47:09 crc kubenswrapper[4691]: I1202 08:47:09.068167 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-656d54dc75-wplwb_0d3a05b7-702e-4564-9014-edffe6fc64ea/webhook-server/0.log" Dec 02 08:47:09 crc kubenswrapper[4691]: I1202 08:47:09.150493 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bv29g_defc2342-4163-4319-afaa-fa2eb042082c/kube-rbac-proxy/0.log" Dec 02 08:47:09 crc kubenswrapper[4691]: I1202 08:47:09.677063 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/frr/0.log" Dec 02 08:47:09 crc kubenswrapper[4691]: I1202 08:47:09.787075 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bv29g_defc2342-4163-4319-afaa-fa2eb042082c/speaker/0.log" Dec 02 08:47:21 crc kubenswrapper[4691]: I1202 08:47:21.217420 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/util/0.log" Dec 02 08:47:21 crc kubenswrapper[4691]: I1202 08:47:21.449414 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/util/0.log" Dec 02 08:47:21 crc kubenswrapper[4691]: I1202 08:47:21.462548 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/pull/0.log" Dec 02 08:47:21 crc kubenswrapper[4691]: I1202 08:47:21.462801 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/pull/0.log" Dec 02 08:47:21 crc kubenswrapper[4691]: I1202 08:47:21.684417 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/util/0.log" Dec 02 08:47:21 crc kubenswrapper[4691]: I1202 08:47:21.690953 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/pull/0.log" Dec 02 08:47:21 crc kubenswrapper[4691]: I1202 08:47:21.691295 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/extract/0.log" Dec 02 08:47:21 crc kubenswrapper[4691]: I1202 08:47:21.889306 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/util/0.log" Dec 02 08:47:22 crc kubenswrapper[4691]: I1202 08:47:22.051809 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/util/0.log" Dec 02 08:47:22 crc kubenswrapper[4691]: I1202 08:47:22.056478 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/pull/0.log" Dec 02 08:47:22 crc kubenswrapper[4691]: I1202 08:47:22.080353 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/pull/0.log" Dec 02 08:47:22 crc kubenswrapper[4691]: I1202 08:47:22.249562 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/pull/0.log" Dec 02 08:47:22 crc kubenswrapper[4691]: I1202 08:47:22.251261 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/util/0.log" Dec 02 08:47:22 crc kubenswrapper[4691]: I1202 08:47:22.316172 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/extract/0.log" Dec 02 08:47:22 crc kubenswrapper[4691]: I1202 08:47:22.445274 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/extract-utilities/0.log" Dec 02 08:47:22 crc kubenswrapper[4691]: I1202 08:47:22.610929 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/extract-content/0.log" Dec 02 08:47:22 crc kubenswrapper[4691]: I1202 08:47:22.639539 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/extract-content/0.log" Dec 02 08:47:22 crc kubenswrapper[4691]: I1202 08:47:22.651847 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/extract-utilities/0.log" Dec 02 08:47:22 crc kubenswrapper[4691]: I1202 08:47:22.837441 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/extract-utilities/0.log" Dec 02 08:47:22 crc kubenswrapper[4691]: I1202 08:47:22.892003 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/extract-content/0.log" Dec 02 08:47:23 crc kubenswrapper[4691]: I1202 08:47:23.124627 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/extract-utilities/0.log" Dec 02 08:47:23 crc kubenswrapper[4691]: I1202 08:47:23.282521 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/extract-utilities/0.log" Dec 02 08:47:23 crc kubenswrapper[4691]: I1202 08:47:23.323840 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/extract-content/0.log" Dec 02 08:47:23 crc kubenswrapper[4691]: I1202 08:47:23.366308 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/extract-content/0.log" Dec 02 08:47:23 crc kubenswrapper[4691]: I1202 08:47:23.474878 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/registry-server/0.log" Dec 02 08:47:23 crc kubenswrapper[4691]: I1202 08:47:23.581394 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/extract-content/0.log" Dec 02 08:47:23 crc kubenswrapper[4691]: I1202 08:47:23.606339 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/extract-utilities/0.log" Dec 02 08:47:23 crc kubenswrapper[4691]: I1202 08:47:23.825743 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kb4km_411ccc93-fc18-44f5-b96f-f2da874ae9be/marketplace-operator/0.log" Dec 02 08:47:24 crc kubenswrapper[4691]: I1202 08:47:24.004413 4691 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/extract-utilities/0.log" Dec 02 08:47:24 crc kubenswrapper[4691]: I1202 08:47:24.128573 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/registry-server/0.log" Dec 02 08:47:24 crc kubenswrapper[4691]: I1202 08:47:24.161893 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/extract-utilities/0.log" Dec 02 08:47:24 crc kubenswrapper[4691]: I1202 08:47:24.225023 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/extract-content/0.log" Dec 02 08:47:24 crc kubenswrapper[4691]: I1202 08:47:24.232710 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/extract-content/0.log" Dec 02 08:47:24 crc kubenswrapper[4691]: I1202 08:47:24.363844 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/extract-utilities/0.log" Dec 02 08:47:24 crc kubenswrapper[4691]: I1202 08:47:24.416752 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/extract-content/0.log" Dec 02 08:47:24 crc kubenswrapper[4691]: I1202 08:47:24.556944 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/registry-server/0.log" Dec 02 08:47:24 crc kubenswrapper[4691]: I1202 08:47:24.598485 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/extract-utilities/0.log" Dec 02 08:47:24 crc kubenswrapper[4691]: I1202 08:47:24.818099 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/extract-content/0.log" Dec 02 08:47:24 crc kubenswrapper[4691]: I1202 08:47:24.830071 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/extract-utilities/0.log" Dec 02 08:47:24 crc kubenswrapper[4691]: I1202 08:47:24.845713 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/extract-content/0.log" Dec 02 08:47:25 crc kubenswrapper[4691]: I1202 08:47:25.014842 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/extract-utilities/0.log" Dec 02 08:47:25 crc kubenswrapper[4691]: I1202 08:47:25.052526 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/extract-content/0.log" Dec 02 08:47:25 crc kubenswrapper[4691]: I1202 08:47:25.566899 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/registry-server/0.log" Dec 02 08:47:46 crc kubenswrapper[4691]: E1202 08:47:46.984961 4691 upgradeaware.go:441] Error proxying data from backend to 
client: writeto tcp 38.102.83.222:37580->38.102.83.222:41995: read tcp 38.102.83.222:37580->38.102.83.222:41995: read: connection reset by peer Dec 02 08:48:51 crc kubenswrapper[4691]: I1202 08:48:51.898912 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:48:51 crc kubenswrapper[4691]: I1202 08:48:51.899454 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:49:15 crc kubenswrapper[4691]: I1202 08:49:15.138512 4691 generic.go:334] "Generic (PLEG): container finished" podID="0f74c8c7-5577-4b38-824e-0ef73b775f64" containerID="286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1" exitCode=0 Dec 02 08:49:15 crc kubenswrapper[4691]: I1202 08:49:15.138620 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-4lkz8/must-gather-jc7xh" event={"ID":"0f74c8c7-5577-4b38-824e-0ef73b775f64","Type":"ContainerDied","Data":"286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1"} Dec 02 08:49:15 crc kubenswrapper[4691]: I1202 08:49:15.140018 4691 scope.go:117] "RemoveContainer" containerID="286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1" Dec 02 08:49:15 crc kubenswrapper[4691]: I1202 08:49:15.222560 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4lkz8_must-gather-jc7xh_0f74c8c7-5577-4b38-824e-0ef73b775f64/gather/0.log" Dec 02 08:49:17 crc kubenswrapper[4691]: E1202 08:49:17.417496 4691 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.222:41154->38.102.83.222:41995: write tcp 38.102.83.222:41154->38.102.83.222:41995: write: broken pipe Dec 02 08:49:21 crc kubenswrapper[4691]: I1202 08:49:21.898311 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:49:21 crc kubenswrapper[4691]: I1202 08:49:21.899200 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:49:22 crc kubenswrapper[4691]: I1202 08:49:22.585180 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-4lkz8/must-gather-jc7xh"] Dec 02 08:49:22 crc kubenswrapper[4691]: I1202 08:49:22.585479 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-4lkz8/must-gather-jc7xh" podUID="0f74c8c7-5577-4b38-824e-0ef73b775f64" containerName="copy" containerID="cri-o://2aae764f483f977dadf5e47ec8cb84d3fdc965c7ea50c0914b288a6fa2275129" gracePeriod=2 Dec 02 08:49:22 crc kubenswrapper[4691]: I1202 08:49:22.656113 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-4lkz8/must-gather-jc7xh"] Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.079951 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4lkz8_must-gather-jc7xh_0f74c8c7-5577-4b38-824e-0ef73b775f64/copy/0.log" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.081612 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/must-gather-jc7xh" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.213867 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0f74c8c7-5577-4b38-824e-0ef73b775f64-must-gather-output\") pod \"0f74c8c7-5577-4b38-824e-0ef73b775f64\" (UID: \"0f74c8c7-5577-4b38-824e-0ef73b775f64\") " Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.213924 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bz562\" (UniqueName: \"kubernetes.io/projected/0f74c8c7-5577-4b38-824e-0ef73b775f64-kube-api-access-bz562\") pod \"0f74c8c7-5577-4b38-824e-0ef73b775f64\" (UID: \"0f74c8c7-5577-4b38-824e-0ef73b775f64\") " Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.241187 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-4lkz8_must-gather-jc7xh_0f74c8c7-5577-4b38-824e-0ef73b775f64/copy/0.log" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.241204 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f74c8c7-5577-4b38-824e-0ef73b775f64-kube-api-access-bz562" (OuterVolumeSpecName: "kube-api-access-bz562") pod "0f74c8c7-5577-4b38-824e-0ef73b775f64" (UID: "0f74c8c7-5577-4b38-824e-0ef73b775f64"). InnerVolumeSpecName "kube-api-access-bz562". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.256305 4691 generic.go:334] "Generic (PLEG): container finished" podID="0f74c8c7-5577-4b38-824e-0ef73b775f64" containerID="2aae764f483f977dadf5e47ec8cb84d3fdc965c7ea50c0914b288a6fa2275129" exitCode=143 Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.256379 4691 scope.go:117] "RemoveContainer" containerID="2aae764f483f977dadf5e47ec8cb84d3fdc965c7ea50c0914b288a6fa2275129" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.256567 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-4lkz8/must-gather-jc7xh" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.319465 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bz562\" (UniqueName: \"kubernetes.io/projected/0f74c8c7-5577-4b38-824e-0ef73b775f64-kube-api-access-bz562\") on node \"crc\" DevicePath \"\"" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.329927 4691 scope.go:117] "RemoveContainer" containerID="286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.398156 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f74c8c7-5577-4b38-824e-0ef73b775f64-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0f74c8c7-5577-4b38-824e-0ef73b775f64" (UID: "0f74c8c7-5577-4b38-824e-0ef73b775f64"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.417918 4691 scope.go:117] "RemoveContainer" containerID="2aae764f483f977dadf5e47ec8cb84d3fdc965c7ea50c0914b288a6fa2275129" Dec 02 08:49:23 crc kubenswrapper[4691]: E1202 08:49:23.418413 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aae764f483f977dadf5e47ec8cb84d3fdc965c7ea50c0914b288a6fa2275129\": container with ID starting with 2aae764f483f977dadf5e47ec8cb84d3fdc965c7ea50c0914b288a6fa2275129 not found: ID does not exist" containerID="2aae764f483f977dadf5e47ec8cb84d3fdc965c7ea50c0914b288a6fa2275129" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.418446 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aae764f483f977dadf5e47ec8cb84d3fdc965c7ea50c0914b288a6fa2275129"} err="failed to get container status \"2aae764f483f977dadf5e47ec8cb84d3fdc965c7ea50c0914b288a6fa2275129\": rpc error: code = NotFound desc = could not find container \"2aae764f483f977dadf5e47ec8cb84d3fdc965c7ea50c0914b288a6fa2275129\": container with ID starting with 2aae764f483f977dadf5e47ec8cb84d3fdc965c7ea50c0914b288a6fa2275129 not found: ID does not exist" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.418500 4691 scope.go:117] "RemoveContainer" containerID="286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1" Dec 02 08:49:23 crc kubenswrapper[4691]: E1202 08:49:23.420118 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1\": container with ID starting with 286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1 not found: ID does not exist" containerID="286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.420206 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1"} err="failed to get container status \"286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1\": rpc error: code = NotFound desc = could not find container \"286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1\": container with ID starting with 286df41f1b6e80ad0d74bc2bdc86e09c35cdce6a3dbfa9e8fcf4c6a60ac615b1 not found: ID does not exist" Dec 02 08:49:23 crc kubenswrapper[4691]: I1202 08:49:23.421636 4691 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0f74c8c7-5577-4b38-824e-0ef73b775f64-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 02 08:49:24 crc kubenswrapper[4691]: I1202 08:49:24.578391 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f74c8c7-5577-4b38-824e-0ef73b775f64" path="/var/lib/kubelet/pods/0f74c8c7-5577-4b38-824e-0ef73b775f64/volumes" Dec 02 08:49:51 crc kubenswrapper[4691]: I1202 08:49:51.899071 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:49:51 crc kubenswrapper[4691]: I1202 08:49:51.899834 4691 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:49:51 crc kubenswrapper[4691]: I1202 08:49:51.899895 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 08:49:51 crc kubenswrapper[4691]: I1202 08:49:51.900778 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:49:51 crc kubenswrapper[4691]: I1202 08:49:51.900840 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" gracePeriod=600 Dec 02 08:49:52 crc kubenswrapper[4691]: E1202 08:49:52.530934 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:49:52 crc kubenswrapper[4691]: I1202 08:49:52.559584 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" exitCode=0 Dec 02 08:49:52 crc kubenswrapper[4691]: I1202 08:49:52.559652 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4"} Dec 02 08:49:52 crc kubenswrapper[4691]: I1202 08:49:52.559705 4691 scope.go:117] "RemoveContainer" containerID="6047dc0d1f247765ff138c50dd8d084fbe9f7fca33383efa09c95136136de241" Dec 02 08:49:52 crc kubenswrapper[4691]: I1202 08:49:52.560429 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:49:52 crc kubenswrapper[4691]: E1202 08:49:52.560735 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:50:06 crc kubenswrapper[4691]: I1202 08:50:06.562294 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:50:06 crc kubenswrapper[4691]: E1202 08:50:06.564183 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:50:20 crc kubenswrapper[4691]: I1202 08:50:20.562275 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:50:20 crc kubenswrapper[4691]: E1202 08:50:20.563206 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:50:32 crc kubenswrapper[4691]: I1202 08:50:32.590726 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:50:32 crc kubenswrapper[4691]: E1202 08:50:32.591606 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:50:43 crc kubenswrapper[4691]: I1202 08:50:43.561869 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:50:43 crc kubenswrapper[4691]: E1202 08:50:43.563012 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:50:57 crc kubenswrapper[4691]: I1202 08:50:57.561835 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:50:57 crc kubenswrapper[4691]: E1202 08:50:57.562866 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:51:08 crc kubenswrapper[4691]: I1202 08:51:08.561968 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:51:08 crc kubenswrapper[4691]: E1202 08:51:08.563033 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:51:20 crc kubenswrapper[4691]: I1202 08:51:20.562495 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:51:20 crc kubenswrapper[4691]: E1202 08:51:20.563378 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:51:32 crc kubenswrapper[4691]: I1202 08:51:32.423428 4691 scope.go:117] "RemoveContainer" containerID="89942a1041164342154c75761701a30ef6ed6c3d6c64615ba2520a018f577e74" Dec 02 08:51:34 crc kubenswrapper[4691]: I1202 08:51:34.562185 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:51:34 crc kubenswrapper[4691]: E1202 08:51:34.562854 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:51:45 crc kubenswrapper[4691]: I1202 08:51:45.562574 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:51:45 crc kubenswrapper[4691]: E1202 08:51:45.563582 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:51:59 crc kubenswrapper[4691]: I1202 08:51:59.562687 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:51:59 crc kubenswrapper[4691]: E1202 08:51:59.563819 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:52:14 crc kubenswrapper[4691]: I1202 08:52:14.562902 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:52:14 crc kubenswrapper[4691]: E1202 08:52:14.564179 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.395994 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bxqm7/must-gather-92sjw"] Dec 02 08:52:22 crc kubenswrapper[4691]: E1202 08:52:22.397438 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f74c8c7-5577-4b38-824e-0ef73b775f64" containerName="gather" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.397461 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f74c8c7-5577-4b38-824e-0ef73b775f64" containerName="gather" Dec 02 08:52:22 crc kubenswrapper[4691]: E1202 08:52:22.397511 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6783594-a1e3-4d8d-b22f-89328a76e124" containerName="container-00" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.397522 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6783594-a1e3-4d8d-b22f-89328a76e124" containerName="container-00" Dec 02 08:52:22 crc kubenswrapper[4691]: E1202 08:52:22.397545 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a47899a-263b-4544-843e-d83808a0f4b9" containerName="extract-utilities" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.397554 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a47899a-263b-4544-843e-d83808a0f4b9" containerName="extract-utilities" Dec 02 08:52:22 crc kubenswrapper[4691]: E1202 08:52:22.397572 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f74c8c7-5577-4b38-824e-0ef73b775f64" containerName="copy" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.397579 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f74c8c7-5577-4b38-824e-0ef73b775f64" containerName="copy" Dec 02 08:52:22 crc kubenswrapper[4691]: E1202 08:52:22.397593 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8fbd51d-0032-4fcf-9064-619bfcc5b045" containerName="collect-profiles" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.397601 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8fbd51d-0032-4fcf-9064-619bfcc5b045" containerName="collect-profiles" Dec 02 08:52:22 crc kubenswrapper[4691]: E1202 08:52:22.397622 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a47899a-263b-4544-843e-d83808a0f4b9" containerName="extract-content" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.397631 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a47899a-263b-4544-843e-d83808a0f4b9" containerName="extract-content" Dec 02 08:52:22 crc kubenswrapper[4691]: E1202 08:52:22.397651 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a47899a-263b-4544-843e-d83808a0f4b9" containerName="registry-server" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.397659 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a47899a-263b-4544-843e-d83808a0f4b9" containerName="registry-server" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.397901 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f74c8c7-5577-4b38-824e-0ef73b775f64" containerName="gather" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.397919 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6783594-a1e3-4d8d-b22f-89328a76e124" containerName="container-00" Dec 02 08:52:22 crc 
kubenswrapper[4691]: I1202 08:52:22.397939 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f74c8c7-5577-4b38-824e-0ef73b775f64" containerName="copy" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.397946 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8fbd51d-0032-4fcf-9064-619bfcc5b045" containerName="collect-profiles" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.397971 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a47899a-263b-4544-843e-d83808a0f4b9" containerName="registry-server" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.399653 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bxqm7/must-gather-92sjw" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.402338 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bxqm7"/"openshift-service-ca.crt" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.402870 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-bxqm7"/"kube-root-ca.crt" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.403641 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-bxqm7"/"default-dockercfg-twkdc" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.412285 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bxqm7/must-gather-92sjw"] Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.562996 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wxc4\" (UniqueName: \"kubernetes.io/projected/55555f90-c926-494d-8b43-e7099f85c550-kube-api-access-7wxc4\") pod \"must-gather-92sjw\" (UID: \"55555f90-c926-494d-8b43-e7099f85c550\") " pod="openshift-must-gather-bxqm7/must-gather-92sjw" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.563099 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/55555f90-c926-494d-8b43-e7099f85c550-must-gather-output\") pod \"must-gather-92sjw\" (UID: \"55555f90-c926-494d-8b43-e7099f85c550\") " pod="openshift-must-gather-bxqm7/must-gather-92sjw" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.665649 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/55555f90-c926-494d-8b43-e7099f85c550-must-gather-output\") pod \"must-gather-92sjw\" (UID: \"55555f90-c926-494d-8b43-e7099f85c550\") " pod="openshift-must-gather-bxqm7/must-gather-92sjw" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.665959 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wxc4\" (UniqueName: \"kubernetes.io/projected/55555f90-c926-494d-8b43-e7099f85c550-kube-api-access-7wxc4\") pod \"must-gather-92sjw\" (UID: \"55555f90-c926-494d-8b43-e7099f85c550\") " pod="openshift-must-gather-bxqm7/must-gather-92sjw" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.666868 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/55555f90-c926-494d-8b43-e7099f85c550-must-gather-output\") pod \"must-gather-92sjw\" (UID: \"55555f90-c926-494d-8b43-e7099f85c550\") " pod="openshift-must-gather-bxqm7/must-gather-92sjw" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 
08:52:22.687630 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wxc4\" (UniqueName: \"kubernetes.io/projected/55555f90-c926-494d-8b43-e7099f85c550-kube-api-access-7wxc4\") pod \"must-gather-92sjw\" (UID: \"55555f90-c926-494d-8b43-e7099f85c550\") " pod="openshift-must-gather-bxqm7/must-gather-92sjw" Dec 02 08:52:22 crc kubenswrapper[4691]: I1202 08:52:22.750935 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bxqm7/must-gather-92sjw" Dec 02 08:52:23 crc kubenswrapper[4691]: I1202 08:52:23.224089 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bxqm7/must-gather-92sjw"] Dec 02 08:52:23 crc kubenswrapper[4691]: I1202 08:52:23.489615 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxqm7/must-gather-92sjw" event={"ID":"55555f90-c926-494d-8b43-e7099f85c550","Type":"ContainerStarted","Data":"ed5d0b10cf0921f9c85515aea09741b9a5968ba4861a5ec32358e4850187c18e"} Dec 02 08:52:24 crc kubenswrapper[4691]: I1202 08:52:24.504077 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxqm7/must-gather-92sjw" event={"ID":"55555f90-c926-494d-8b43-e7099f85c550","Type":"ContainerStarted","Data":"54326bcba51fbdc52d86cdbd3e63bfad3baf85845d97507d014d141b9dbb536b"} Dec 02 08:52:24 crc kubenswrapper[4691]: I1202 08:52:24.504499 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxqm7/must-gather-92sjw" event={"ID":"55555f90-c926-494d-8b43-e7099f85c550","Type":"ContainerStarted","Data":"898066b8a1ffe22516b6bda6441cced35f660c3172d46412ffbe772d76ab60ae"} Dec 02 08:52:24 crc kubenswrapper[4691]: I1202 08:52:24.529131 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bxqm7/must-gather-92sjw" podStartSLOduration=2.529106929 podStartE2EDuration="2.529106929s" podCreationTimestamp="2025-12-02 08:52:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:52:24.525317884 +0000 UTC m=+3992.309396756" watchObservedRunningTime="2025-12-02 08:52:24.529106929 +0000 UTC m=+3992.313185791" Dec 02 08:52:26 crc kubenswrapper[4691]: I1202 08:52:26.562155 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:52:26 crc kubenswrapper[4691]: E1202 08:52:26.562801 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:52:28 crc kubenswrapper[4691]: I1202 08:52:28.522093 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bxqm7/crc-debug-qlq4z"]
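
The m=+3992.309396756 suffix in the pod_startup_latency_tracker timestamps a few entries above is Go's monotonic-clock reading, which the time package prints alongside the wall clock: it counts seconds since the process started, and 3992s is roughly 1h6.5m before 08:52:24, i.e. a kubelet start around 07:45:52. A tiny demonstration of where that suffix comes from:

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // time.Now() carries a monotonic reading alongside the wall clock;
        // printing the value shows it as the "m=+..." suffix, exactly as in
        // the observedRunningTime fields above.
        start := time.Now()
        time.Sleep(50 * time.Millisecond)
        now := time.Now()
        fmt.Println(now)            // e.g. "2025-12-02 08:52:24 +0000 UTC m=+0.050123456"
        fmt.Println(now.Sub(start)) // durations between two readings use the monotonic clock
    }

Dec 02 08:52:28 crc kubenswrapper[4691]: I1202 08:52:28.526842 4691 util.go:30] "No sandbox for pod can be found. 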
Need to start a new one" pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" Dec 02 08:52:28 crc kubenswrapper[4691]: I1202 08:52:28.610939 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck6b9\" (UniqueName: \"kubernetes.io/projected/fe266bbb-b647-413b-a40a-23a99dab1d3d-kube-api-access-ck6b9\") pod \"crc-debug-qlq4z\" (UID: \"fe266bbb-b647-413b-a40a-23a99dab1d3d\") " pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" Dec 02 08:52:28 crc kubenswrapper[4691]: I1202 08:52:28.610989 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe266bbb-b647-413b-a40a-23a99dab1d3d-host\") pod \"crc-debug-qlq4z\" (UID: \"fe266bbb-b647-413b-a40a-23a99dab1d3d\") " pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" Dec 02 08:52:28 crc kubenswrapper[4691]: I1202 08:52:28.712835 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck6b9\" (UniqueName: \"kubernetes.io/projected/fe266bbb-b647-413b-a40a-23a99dab1d3d-kube-api-access-ck6b9\") pod \"crc-debug-qlq4z\" (UID: \"fe266bbb-b647-413b-a40a-23a99dab1d3d\") " pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" Dec 02 08:52:28 crc kubenswrapper[4691]: I1202 08:52:28.712904 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe266bbb-b647-413b-a40a-23a99dab1d3d-host\") pod \"crc-debug-qlq4z\" (UID: \"fe266bbb-b647-413b-a40a-23a99dab1d3d\") " pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" Dec 02 08:52:28 crc kubenswrapper[4691]: I1202 08:52:28.713028 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe266bbb-b647-413b-a40a-23a99dab1d3d-host\") pod \"crc-debug-qlq4z\" (UID: \"fe266bbb-b647-413b-a40a-23a99dab1d3d\") " pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" Dec 02 08:52:28 crc kubenswrapper[4691]: I1202 08:52:28.736211 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck6b9\" (UniqueName: \"kubernetes.io/projected/fe266bbb-b647-413b-a40a-23a99dab1d3d-kube-api-access-ck6b9\") pod \"crc-debug-qlq4z\" (UID: \"fe266bbb-b647-413b-a40a-23a99dab1d3d\") " pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" Dec 02 08:52:28 crc kubenswrapper[4691]: I1202 08:52:28.867436 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" Dec 02 08:52:29 crc kubenswrapper[4691]: I1202 08:52:29.569210 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" event={"ID":"fe266bbb-b647-413b-a40a-23a99dab1d3d","Type":"ContainerStarted","Data":"55b7349b45b2709c80511d201e41ee6fd13bbbfe1b359c26ed1396ca3822acee"} Dec 02 08:52:29 crc kubenswrapper[4691]: I1202 08:52:29.569820 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" event={"ID":"fe266bbb-b647-413b-a40a-23a99dab1d3d","Type":"ContainerStarted","Data":"d3e0cc683e02fcd5e017321cb004603efc18d00c9d8bf3295a2f083f9210efbe"} Dec 02 08:52:40 crc kubenswrapper[4691]: I1202 08:52:40.563089 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:52:40 crc kubenswrapper[4691]: E1202 08:52:40.563946 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:52:52 crc kubenswrapper[4691]: I1202 08:52:52.580635 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:52:52 crc kubenswrapper[4691]: E1202 08:52:52.581877 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:53:04 crc kubenswrapper[4691]: I1202 08:53:04.562510 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:53:04 crc kubenswrapper[4691]: E1202 08:53:04.563434 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:53:07 crc kubenswrapper[4691]: I1202 08:53:07.984648 4691 generic.go:334] "Generic (PLEG): container finished" podID="fe266bbb-b647-413b-a40a-23a99dab1d3d" containerID="55b7349b45b2709c80511d201e41ee6fd13bbbfe1b359c26ed1396ca3822acee" exitCode=0 Dec 02 08:53:07 crc kubenswrapper[4691]: I1202 08:53:07.984732 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" event={"ID":"fe266bbb-b647-413b-a40a-23a99dab1d3d","Type":"ContainerDied","Data":"55b7349b45b2709c80511d201e41ee6fd13bbbfe1b359c26ed1396ca3822acee"} Dec 02 08:53:09 crc kubenswrapper[4691]: I1202 08:53:09.114996 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" Dec 02 08:53:09 crc kubenswrapper[4691]: I1202 08:53:09.148985 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bxqm7/crc-debug-qlq4z"] Dec 02 08:53:09 crc kubenswrapper[4691]: I1202 08:53:09.161831 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bxqm7/crc-debug-qlq4z"] Dec 02 08:53:09 crc kubenswrapper[4691]: I1202 08:53:09.248349 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck6b9\" (UniqueName: \"kubernetes.io/projected/fe266bbb-b647-413b-a40a-23a99dab1d3d-kube-api-access-ck6b9\") pod \"fe266bbb-b647-413b-a40a-23a99dab1d3d\" (UID: \"fe266bbb-b647-413b-a40a-23a99dab1d3d\") " Dec 02 08:53:09 crc kubenswrapper[4691]: I1202 08:53:09.248548 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe266bbb-b647-413b-a40a-23a99dab1d3d-host\") pod \"fe266bbb-b647-413b-a40a-23a99dab1d3d\" (UID: \"fe266bbb-b647-413b-a40a-23a99dab1d3d\") " Dec 02 08:53:09 crc kubenswrapper[4691]: I1202 08:53:09.248744 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe266bbb-b647-413b-a40a-23a99dab1d3d-host" (OuterVolumeSpecName: "host") pod "fe266bbb-b647-413b-a40a-23a99dab1d3d" (UID: "fe266bbb-b647-413b-a40a-23a99dab1d3d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:53:09 crc kubenswrapper[4691]: I1202 08:53:09.250138 4691 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe266bbb-b647-413b-a40a-23a99dab1d3d-host\") on node \"crc\" DevicePath \"\"" Dec 02 08:53:09 crc kubenswrapper[4691]: I1202 08:53:09.254921 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe266bbb-b647-413b-a40a-23a99dab1d3d-kube-api-access-ck6b9" (OuterVolumeSpecName: "kube-api-access-ck6b9") pod "fe266bbb-b647-413b-a40a-23a99dab1d3d" (UID: "fe266bbb-b647-413b-a40a-23a99dab1d3d"). InnerVolumeSpecName "kube-api-access-ck6b9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:53:09 crc kubenswrapper[4691]: I1202 08:53:09.352526 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck6b9\" (UniqueName: \"kubernetes.io/projected/fe266bbb-b647-413b-a40a-23a99dab1d3d-kube-api-access-ck6b9\") on node \"crc\" DevicePath \"\"" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.006537 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3e0cc683e02fcd5e017321cb004603efc18d00c9d8bf3295a2f083f9210efbe" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.007122 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bxqm7/crc-debug-qlq4z" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.374646 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bxqm7/crc-debug-mnmr6"] Dec 02 08:53:10 crc kubenswrapper[4691]: E1202 08:53:10.376622 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe266bbb-b647-413b-a40a-23a99dab1d3d" containerName="container-00" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.376740 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe266bbb-b647-413b-a40a-23a99dab1d3d" containerName="container-00" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.377057 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe266bbb-b647-413b-a40a-23a99dab1d3d" containerName="container-00" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.378130 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.477449 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e70e8a19-ebaa-40fd-88d8-1319c582f414-host\") pod \"crc-debug-mnmr6\" (UID: \"e70e8a19-ebaa-40fd-88d8-1319c582f414\") " pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.477531 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thsnx\" (UniqueName: \"kubernetes.io/projected/e70e8a19-ebaa-40fd-88d8-1319c582f414-kube-api-access-thsnx\") pod \"crc-debug-mnmr6\" (UID: \"e70e8a19-ebaa-40fd-88d8-1319c582f414\") " pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.579006 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e70e8a19-ebaa-40fd-88d8-1319c582f414-host\") pod \"crc-debug-mnmr6\" (UID: \"e70e8a19-ebaa-40fd-88d8-1319c582f414\") " pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.579066 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thsnx\" (UniqueName: \"kubernetes.io/projected/e70e8a19-ebaa-40fd-88d8-1319c582f414-kube-api-access-thsnx\") pod \"crc-debug-mnmr6\" (UID: \"e70e8a19-ebaa-40fd-88d8-1319c582f414\") " pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.579482 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e70e8a19-ebaa-40fd-88d8-1319c582f414-host\") pod \"crc-debug-mnmr6\" (UID: \"e70e8a19-ebaa-40fd-88d8-1319c582f414\") " pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.579823 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe266bbb-b647-413b-a40a-23a99dab1d3d" path="/var/lib/kubelet/pods/fe266bbb-b647-413b-a40a-23a99dab1d3d/volumes" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.610255 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thsnx\" (UniqueName: \"kubernetes.io/projected/e70e8a19-ebaa-40fd-88d8-1319c582f414-kube-api-access-thsnx\") pod \"crc-debug-mnmr6\" (UID: \"e70e8a19-ebaa-40fd-88d8-1319c582f414\") " 
pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" Dec 02 08:53:10 crc kubenswrapper[4691]: I1202 08:53:10.702443 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" Dec 02 08:53:11 crc kubenswrapper[4691]: I1202 08:53:11.022038 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" event={"ID":"e70e8a19-ebaa-40fd-88d8-1319c582f414","Type":"ContainerStarted","Data":"0211c7f1354a97d0a0208ddd6b834c8549dd924ab6718c21d756dc039208be75"} Dec 02 08:53:11 crc kubenswrapper[4691]: I1202 08:53:11.022475 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" event={"ID":"e70e8a19-ebaa-40fd-88d8-1319c582f414","Type":"ContainerStarted","Data":"a74e3a87cba4c4285b8422b6e47a8b41fc06d7c60697e8f78ba12ef57b9731b5"} Dec 02 08:53:11 crc kubenswrapper[4691]: I1202 08:53:11.043796 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" podStartSLOduration=1.043748798 podStartE2EDuration="1.043748798s" podCreationTimestamp="2025-12-02 08:53:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 08:53:11.037712336 +0000 UTC m=+4038.821791208" watchObservedRunningTime="2025-12-02 08:53:11.043748798 +0000 UTC m=+4038.827827660" Dec 02 08:53:12 crc kubenswrapper[4691]: I1202 08:53:12.035647 4691 generic.go:334] "Generic (PLEG): container finished" podID="e70e8a19-ebaa-40fd-88d8-1319c582f414" containerID="0211c7f1354a97d0a0208ddd6b834c8549dd924ab6718c21d756dc039208be75" exitCode=0 Dec 02 08:53:12 crc kubenswrapper[4691]: I1202 08:53:12.035699 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" event={"ID":"e70e8a19-ebaa-40fd-88d8-1319c582f414","Type":"ContainerDied","Data":"0211c7f1354a97d0a0208ddd6b834c8549dd924ab6718c21d756dc039208be75"} Dec 02 08:53:13 crc kubenswrapper[4691]: I1202 08:53:13.159495 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" Dec 02 08:53:13 crc kubenswrapper[4691]: I1202 08:53:13.197715 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bxqm7/crc-debug-mnmr6"] Dec 02 08:53:13 crc kubenswrapper[4691]: I1202 08:53:13.245092 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bxqm7/crc-debug-mnmr6"] Dec 02 08:53:13 crc kubenswrapper[4691]: I1202 08:53:13.336921 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thsnx\" (UniqueName: \"kubernetes.io/projected/e70e8a19-ebaa-40fd-88d8-1319c582f414-kube-api-access-thsnx\") pod \"e70e8a19-ebaa-40fd-88d8-1319c582f414\" (UID: \"e70e8a19-ebaa-40fd-88d8-1319c582f414\") " Dec 02 08:53:13 crc kubenswrapper[4691]: I1202 08:53:13.337076 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e70e8a19-ebaa-40fd-88d8-1319c582f414-host\") pod \"e70e8a19-ebaa-40fd-88d8-1319c582f414\" (UID: \"e70e8a19-ebaa-40fd-88d8-1319c582f414\") " Dec 02 08:53:13 crc kubenswrapper[4691]: I1202 08:53:13.337631 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e70e8a19-ebaa-40fd-88d8-1319c582f414-host" (OuterVolumeSpecName: "host") pod "e70e8a19-ebaa-40fd-88d8-1319c582f414" (UID: "e70e8a19-ebaa-40fd-88d8-1319c582f414"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:53:13 crc kubenswrapper[4691]: I1202 08:53:13.345322 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e70e8a19-ebaa-40fd-88d8-1319c582f414-kube-api-access-thsnx" (OuterVolumeSpecName: "kube-api-access-thsnx") pod "e70e8a19-ebaa-40fd-88d8-1319c582f414" (UID: "e70e8a19-ebaa-40fd-88d8-1319c582f414"). InnerVolumeSpecName "kube-api-access-thsnx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:53:13 crc kubenswrapper[4691]: I1202 08:53:13.439260 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thsnx\" (UniqueName: \"kubernetes.io/projected/e70e8a19-ebaa-40fd-88d8-1319c582f414-kube-api-access-thsnx\") on node \"crc\" DevicePath \"\"" Dec 02 08:53:13 crc kubenswrapper[4691]: I1202 08:53:13.439300 4691 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e70e8a19-ebaa-40fd-88d8-1319c582f414-host\") on node \"crc\" DevicePath \"\"" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.058651 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a74e3a87cba4c4285b8422b6e47a8b41fc06d7c60697e8f78ba12ef57b9731b5" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.058697 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bxqm7/crc-debug-mnmr6" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.391168 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bxqm7/crc-debug-thqjf"] Dec 02 08:53:14 crc kubenswrapper[4691]: E1202 08:53:14.392099 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70e8a19-ebaa-40fd-88d8-1319c582f414" containerName="container-00" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.392287 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70e8a19-ebaa-40fd-88d8-1319c582f414" containerName="container-00" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.392500 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e70e8a19-ebaa-40fd-88d8-1319c582f414" containerName="container-00" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.393445 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bxqm7/crc-debug-thqjf" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.562688 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd0e4ced-5fed-422f-bb29-9793bac53318-host\") pod \"crc-debug-thqjf\" (UID: \"fd0e4ced-5fed-422f-bb29-9793bac53318\") " pod="openshift-must-gather-bxqm7/crc-debug-thqjf" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.562808 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zckg\" (UniqueName: \"kubernetes.io/projected/fd0e4ced-5fed-422f-bb29-9793bac53318-kube-api-access-9zckg\") pod \"crc-debug-thqjf\" (UID: \"fd0e4ced-5fed-422f-bb29-9793bac53318\") " pod="openshift-must-gather-bxqm7/crc-debug-thqjf" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.579692 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e70e8a19-ebaa-40fd-88d8-1319c582f414" path="/var/lib/kubelet/pods/e70e8a19-ebaa-40fd-88d8-1319c582f414/volumes" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.665290 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd0e4ced-5fed-422f-bb29-9793bac53318-host\") pod \"crc-debug-thqjf\" (UID: \"fd0e4ced-5fed-422f-bb29-9793bac53318\") " pod="openshift-must-gather-bxqm7/crc-debug-thqjf" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.665393 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zckg\" (UniqueName: \"kubernetes.io/projected/fd0e4ced-5fed-422f-bb29-9793bac53318-kube-api-access-9zckg\") pod \"crc-debug-thqjf\" (UID: \"fd0e4ced-5fed-422f-bb29-9793bac53318\") " pod="openshift-must-gather-bxqm7/crc-debug-thqjf" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.665499 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd0e4ced-5fed-422f-bb29-9793bac53318-host\") pod \"crc-debug-thqjf\" (UID: \"fd0e4ced-5fed-422f-bb29-9793bac53318\") " pod="openshift-must-gather-bxqm7/crc-debug-thqjf" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.689855 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zckg\" (UniqueName: \"kubernetes.io/projected/fd0e4ced-5fed-422f-bb29-9793bac53318-kube-api-access-9zckg\") pod \"crc-debug-thqjf\" (UID: \"fd0e4ced-5fed-422f-bb29-9793bac53318\") " 
pod="openshift-must-gather-bxqm7/crc-debug-thqjf" Dec 02 08:53:14 crc kubenswrapper[4691]: I1202 08:53:14.725217 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bxqm7/crc-debug-thqjf" Dec 02 08:53:15 crc kubenswrapper[4691]: I1202 08:53:15.070162 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxqm7/crc-debug-thqjf" event={"ID":"fd0e4ced-5fed-422f-bb29-9793bac53318","Type":"ContainerStarted","Data":"c9207aa20d7959fc3dd589a425ab466cd15d535a65e636dd50c29e9dd95055e6"} Dec 02 08:53:16 crc kubenswrapper[4691]: I1202 08:53:16.084192 4691 generic.go:334] "Generic (PLEG): container finished" podID="fd0e4ced-5fed-422f-bb29-9793bac53318" containerID="eb7f485cbd7c7fbee58dd9b453b277adfe9258f07ee7a77c6908045d605d4891" exitCode=0 Dec 02 08:53:16 crc kubenswrapper[4691]: I1202 08:53:16.084296 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxqm7/crc-debug-thqjf" event={"ID":"fd0e4ced-5fed-422f-bb29-9793bac53318","Type":"ContainerDied","Data":"eb7f485cbd7c7fbee58dd9b453b277adfe9258f07ee7a77c6908045d605d4891"} Dec 02 08:53:16 crc kubenswrapper[4691]: I1202 08:53:16.132626 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bxqm7/crc-debug-thqjf"] Dec 02 08:53:16 crc kubenswrapper[4691]: I1202 08:53:16.142146 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bxqm7/crc-debug-thqjf"] Dec 02 08:53:17 crc kubenswrapper[4691]: I1202 08:53:17.204669 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bxqm7/crc-debug-thqjf" Dec 02 08:53:17 crc kubenswrapper[4691]: I1202 08:53:17.321355 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zckg\" (UniqueName: \"kubernetes.io/projected/fd0e4ced-5fed-422f-bb29-9793bac53318-kube-api-access-9zckg\") pod \"fd0e4ced-5fed-422f-bb29-9793bac53318\" (UID: \"fd0e4ced-5fed-422f-bb29-9793bac53318\") " Dec 02 08:53:17 crc kubenswrapper[4691]: I1202 08:53:17.321447 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd0e4ced-5fed-422f-bb29-9793bac53318-host\") pod \"fd0e4ced-5fed-422f-bb29-9793bac53318\" (UID: \"fd0e4ced-5fed-422f-bb29-9793bac53318\") " Dec 02 08:53:17 crc kubenswrapper[4691]: I1202 08:53:17.321587 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd0e4ced-5fed-422f-bb29-9793bac53318-host" (OuterVolumeSpecName: "host") pod "fd0e4ced-5fed-422f-bb29-9793bac53318" (UID: "fd0e4ced-5fed-422f-bb29-9793bac53318"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 02 08:53:17 crc kubenswrapper[4691]: I1202 08:53:17.321989 4691 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fd0e4ced-5fed-422f-bb29-9793bac53318-host\") on node \"crc\" DevicePath \"\"" Dec 02 08:53:17 crc kubenswrapper[4691]: I1202 08:53:17.328068 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd0e4ced-5fed-422f-bb29-9793bac53318-kube-api-access-9zckg" (OuterVolumeSpecName: "kube-api-access-9zckg") pod "fd0e4ced-5fed-422f-bb29-9793bac53318" (UID: "fd0e4ced-5fed-422f-bb29-9793bac53318"). InnerVolumeSpecName "kube-api-access-9zckg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:53:17 crc kubenswrapper[4691]: I1202 08:53:17.424266 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zckg\" (UniqueName: \"kubernetes.io/projected/fd0e4ced-5fed-422f-bb29-9793bac53318-kube-api-access-9zckg\") on node \"crc\" DevicePath \"\"" Dec 02 08:53:18 crc kubenswrapper[4691]: I1202 08:53:18.111555 4691 scope.go:117] "RemoveContainer" containerID="eb7f485cbd7c7fbee58dd9b453b277adfe9258f07ee7a77c6908045d605d4891" Dec 02 08:53:18 crc kubenswrapper[4691]: I1202 08:53:18.111600 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bxqm7/crc-debug-thqjf" Dec 02 08:53:18 crc kubenswrapper[4691]: I1202 08:53:18.562305 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:53:18 crc kubenswrapper[4691]: E1202 08:53:18.562708 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:53:18 crc kubenswrapper[4691]: I1202 08:53:18.575470 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd0e4ced-5fed-422f-bb29-9793bac53318" path="/var/lib/kubelet/pods/fd0e4ced-5fed-422f-bb29-9793bac53318/volumes" Dec 02 08:53:29 crc kubenswrapper[4691]: I1202 08:53:29.561738 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:53:29 crc kubenswrapper[4691]: E1202 08:53:29.562740 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" Dec 02 08:53:37 crc kubenswrapper[4691]: I1202 08:53:37.604975 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-86bc664f5b-6lklq_ff11f2ca-96b6-4cd2-85b8-88916b74efc7/barbican-api/0.log" Dec 02 08:53:37 crc kubenswrapper[4691]: I1202 08:53:37.771690 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-86bc664f5b-6lklq_ff11f2ca-96b6-4cd2-85b8-88916b74efc7/barbican-api-log/0.log" Dec 02 08:53:37 crc kubenswrapper[4691]: I1202 08:53:37.853138 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-598b69485d-n2fl7_d855ba5e-92ab-4a5e-b613-f49c9fec44b1/barbican-keystone-listener/0.log" Dec 02 08:53:37 crc kubenswrapper[4691]: I1202 08:53:37.993940 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-598b69485d-n2fl7_d855ba5e-92ab-4a5e-b613-f49c9fec44b1/barbican-keystone-listener-log/0.log" Dec 02 08:53:38 crc kubenswrapper[4691]: I1202 08:53:38.026717 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57597556c5-xzp56_3c13685c-ff97-4074-8bc8-5659d16ec95d/barbican-worker/0.log" Dec 02 08:53:38 crc kubenswrapper[4691]: I1202 
08:53:38.092838 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-57597556c5-xzp56_3c13685c-ff97-4074-8bc8-5659d16ec95d/barbican-worker-log/0.log"
Dec 02 08:53:38 crc kubenswrapper[4691]: I1202 08:53:38.231676 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ws572_6eabed67-587a-402c-8d6f-02163a229356/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:38 crc kubenswrapper[4691]: I1202 08:53:38.354984 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79ed78c7-a8cc-4ad0-a0cc-38c0f226df93/ceilometer-central-agent/0.log"
Dec 02 08:53:39 crc kubenswrapper[4691]: I1202 08:53:39.085292 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79ed78c7-a8cc-4ad0-a0cc-38c0f226df93/sg-core/0.log"
Dec 02 08:53:39 crc kubenswrapper[4691]: I1202 08:53:39.104868 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79ed78c7-a8cc-4ad0-a0cc-38c0f226df93/proxy-httpd/0.log"
Dec 02 08:53:39 crc kubenswrapper[4691]: I1202 08:53:39.144332 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_79ed78c7-a8cc-4ad0-a0cc-38c0f226df93/ceilometer-notification-agent/0.log"
Dec 02 08:53:39 crc kubenswrapper[4691]: I1202 08:53:39.303170 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0e3d72e5-726a-4f4b-a677-6237021e8747/cinder-api-log/0.log"
Dec 02 08:53:39 crc kubenswrapper[4691]: I1202 08:53:39.369589 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_0e3d72e5-726a-4f4b-a677-6237021e8747/cinder-api/0.log"
Dec 02 08:53:39 crc kubenswrapper[4691]: I1202 08:53:39.423386 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cd7d9765-5aa9-4f8d-af36-53dfbba7da81/cinder-scheduler/0.log"
Dec 02 08:53:39 crc kubenswrapper[4691]: I1202 08:53:39.549535 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_cd7d9765-5aa9-4f8d-af36-53dfbba7da81/probe/0.log"
Dec 02 08:53:39 crc kubenswrapper[4691]: I1202 08:53:39.661051 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-gfqdq_9c3bebb2-7f42-4553-83b6-7fafbb022c70/configure-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:39 crc kubenswrapper[4691]: I1202 08:53:39.763185 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-lvg66_4fde2bba-1e5a-47a2-a918-8e57f11e6d95/configure-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:39 crc kubenswrapper[4691]: I1202 08:53:39.879571 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-g4pgk_fa0cb344-97e1-42ae-867b-30322564459d/init/0.log"
Dec 02 08:53:40 crc kubenswrapper[4691]: I1202 08:53:40.079858 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-g4pgk_fa0cb344-97e1-42ae-867b-30322564459d/init/0.log"
Dec 02 08:53:40 crc kubenswrapper[4691]: I1202 08:53:40.117562 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-tjlhh_90715156-30f9-4dfc-9c78-374f0a07bb4c/download-cache-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:40 crc kubenswrapper[4691]: I1202 08:53:40.121355 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-g4pgk_fa0cb344-97e1-42ae-867b-30322564459d/dnsmasq-dns/0.log"
Dec 02 08:53:40 crc kubenswrapper[4691]: I1202 08:53:40.338426 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_23424898-747d-4eef-8f7e-ee64e1bf1070/glance-log/0.log"
Dec 02 08:53:40 crc kubenswrapper[4691]: I1202 08:53:40.359272 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_23424898-747d-4eef-8f7e-ee64e1bf1070/glance-httpd/0.log"
Dec 02 08:53:40 crc kubenswrapper[4691]: I1202 08:53:40.527912 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0837fb6a-ad2a-4110-bec4-727f9daa999c/glance-httpd/0.log"
Dec 02 08:53:40 crc kubenswrapper[4691]: I1202 08:53:40.567661 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_0837fb6a-ad2a-4110-bec4-727f9daa999c/glance-log/0.log"
Dec 02 08:53:40 crc kubenswrapper[4691]: I1202 08:53:40.762681 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6585c7db4b-jz894_76022e0c-2dd2-4395-8607-aa13da42f557/horizon/0.log"
Dec 02 08:53:41 crc kubenswrapper[4691]: I1202 08:53:41.202776 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-nvtpt_80054e7f-3448-487b-8f0e-fc5eda159e57/install-certs-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:41 crc kubenswrapper[4691]: I1202 08:53:41.506649 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-lx2xx_c4804dc1-5ac2-422e-87fe-71120becde69/install-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:41 crc kubenswrapper[4691]: I1202 08:53:41.735583 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6585c7db4b-jz894_76022e0c-2dd2-4395-8607-aa13da42f557/horizon-log/0.log"
Dec 02 08:53:41 crc kubenswrapper[4691]: I1202 08:53:41.738123 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6960ef77-d277-4be6-be89-446664dd7775/kube-state-metrics/0.log"
Dec 02 08:53:41 crc kubenswrapper[4691]: I1202 08:53:41.782465 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-55c7cfcf8b-8rs5b_f16ad552-e2f2-4d96-b9e8-f0a8d5e3b6dd/keystone-api/0.log"
Dec 02 08:53:41 crc kubenswrapper[4691]: I1202 08:53:41.819095 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-z4wnb_d1c6d92a-1daf-4554-822b-1c946124e1d0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:42 crc kubenswrapper[4691]: I1202 08:53:42.146277 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-95567dd97-rcpxr_22255ebb-1831-4d2e-966b-1ae2fee83ebf/neutron-httpd/0.log"
Dec 02 08:53:42 crc kubenswrapper[4691]: I1202 08:53:42.215288 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-95567dd97-rcpxr_22255ebb-1831-4d2e-966b-1ae2fee83ebf/neutron-api/0.log"
Dec 02 08:53:42 crc kubenswrapper[4691]: I1202 08:53:42.377545 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-mvrxp_c64d5e17-b659-47c6-aa5b-a62be849ee69/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:42 crc kubenswrapper[4691]: I1202 08:53:42.927994 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a12282a4-2fdb-4627-b2ff-06dbde0d2fdb/nova-api-log/0.log"
Dec 02 08:53:42 crc kubenswrapper[4691]: I1202 08:53:42.981660 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_95c80e6a-fc0a-4b4b-a34f-b5ed6f2ec437/nova-cell0-conductor-conductor/0.log"
Dec 02 08:53:43 crc kubenswrapper[4691]: I1202 08:53:43.356230 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_a12282a4-2fdb-4627-b2ff-06dbde0d2fdb/nova-api-api/0.log"
Dec 02 08:53:43 crc kubenswrapper[4691]: I1202 08:53:43.372386 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_e057af5a-bcd5-4612-9c03-350147146c52/nova-cell1-conductor-conductor/0.log"
Dec 02 08:53:43 crc kubenswrapper[4691]: I1202 08:53:43.557900 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0dedf0d3-f3c7-4cb1-9003-8ac588994c43/nova-cell1-novncproxy-novncproxy/0.log"
Dec 02 08:53:43 crc kubenswrapper[4691]: I1202 08:53:43.564777 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4"
Dec 02 08:53:43 crc kubenswrapper[4691]: E1202 08:53:43.565042 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:53:43 crc kubenswrapper[4691]: I1202 08:53:43.645732 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-vf7w6_baf13f0c-0ba4-4e4f-95cb-2de2f510801e/nova-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:43 crc kubenswrapper[4691]: I1202 08:53:43.845905 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cd1a81bc-6c1f-4caa-917a-900c527f0df5/nova-metadata-log/0.log"
Dec 02 08:53:44 crc kubenswrapper[4691]: I1202 08:53:44.154637 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_6ff86164-f22f-49e6-8933-e599da966506/nova-scheduler-scheduler/0.log"
Dec 02 08:53:44 crc kubenswrapper[4691]: I1202 08:53:44.164915 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f14bc2d4-ce0c-440d-9e1d-15b0b8716562/mysql-bootstrap/0.log"
Dec 02 08:53:44 crc kubenswrapper[4691]: I1202 08:53:44.302818 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f14bc2d4-ce0c-440d-9e1d-15b0b8716562/mysql-bootstrap/0.log"
Dec 02 08:53:44 crc kubenswrapper[4691]: I1202 08:53:44.379831 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_f14bc2d4-ce0c-440d-9e1d-15b0b8716562/galera/0.log"
Dec 02 08:53:44 crc kubenswrapper[4691]: I1202 08:53:44.535082 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_aa4f9395-a46a-40e4-a80c-c9b43caadc0b/mysql-bootstrap/0.log"
Dec 02 08:53:44 crc kubenswrapper[4691]: I1202 08:53:44.754468 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_aa4f9395-a46a-40e4-a80c-c9b43caadc0b/galera/0.log"
Dec 02 08:53:44 crc kubenswrapper[4691]: I1202 08:53:44.976742 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_8fc95136-c6f1-4a81-ae0f-0fb7c59c9ab7/openstackclient/0.log"
Dec 02 08:53:45 crc kubenswrapper[4691]: I1202 08:53:45.169496 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-9jwww_ed3b8ff0-c19d-4614-abe4-0ad6b5801b78/ovn-controller/0.log"
Dec 02 08:53:45 crc kubenswrapper[4691]: I1202 08:53:45.303239 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_cd1a81bc-6c1f-4caa-917a-900c527f0df5/nova-metadata-metadata/0.log"
Dec 02 08:53:45 crc kubenswrapper[4691]: I1202 08:53:45.357040 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-mjcrr_0cdf429e-b93d-4009-aaa1-1c45a0083363/openstack-network-exporter/0.log"
Dec 02 08:53:45 crc kubenswrapper[4691]: I1202 08:53:45.534056 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-twpkf_41e5a6f8-bc4a-43d6-b49b-d065f6cef159/ovsdb-server-init/0.log"
Dec 02 08:53:45 crc kubenswrapper[4691]: I1202 08:53:45.619207 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_aa4f9395-a46a-40e4-a80c-c9b43caadc0b/mysql-bootstrap/0.log"
Dec 02 08:53:45 crc kubenswrapper[4691]: I1202 08:53:45.785422 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-twpkf_41e5a6f8-bc4a-43d6-b49b-d065f6cef159/ovsdb-server/0.log"
Dec 02 08:53:45 crc kubenswrapper[4691]: I1202 08:53:45.785661 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-twpkf_41e5a6f8-bc4a-43d6-b49b-d065f6cef159/ovsdb-server-init/0.log"
Dec 02 08:53:45 crc kubenswrapper[4691]: I1202 08:53:45.812412 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-twpkf_41e5a6f8-bc4a-43d6-b49b-d065f6cef159/ovs-vswitchd/0.log"
Dec 02 08:53:46 crc kubenswrapper[4691]: I1202 08:53:46.085214 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-tkz82_87c36120-368b-47ab-baff-e007b39fc1d0/ovn-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:46 crc kubenswrapper[4691]: I1202 08:53:46.086110 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_23291b10-f1ed-4d19-9689-62bdf530e28e/openstack-network-exporter/0.log"
Dec 02 08:53:46 crc kubenswrapper[4691]: I1202 08:53:46.175541 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_23291b10-f1ed-4d19-9689-62bdf530e28e/ovn-northd/0.log"
Dec 02 08:53:46 crc kubenswrapper[4691]: I1202 08:53:46.295504 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4edb0266-7a9e-4e28-810c-7136d8336f1b/ovsdbserver-nb/0.log"
Dec 02 08:53:46 crc kubenswrapper[4691]: I1202 08:53:46.308850 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4edb0266-7a9e-4e28-810c-7136d8336f1b/openstack-network-exporter/0.log"
Dec 02 08:53:46 crc kubenswrapper[4691]: I1202 08:53:46.536673 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9dfe73d8-e7c6-4906-bb6c-64c13435c53f/openstack-network-exporter/0.log"
Dec 02 08:53:46 crc kubenswrapper[4691]: I1202 08:53:46.587533 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_9dfe73d8-e7c6-4906-bb6c-64c13435c53f/ovsdbserver-sb/0.log"
Dec 02 08:53:46 crc kubenswrapper[4691]: I1202 08:53:46.803015 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-99bc7c96-6nbmb_1fd0d9c6-1443-4198-8319-642b450eecb8/placement-api/0.log"
Dec 02 08:53:46 crc kubenswrapper[4691]: I1202 08:53:46.859590 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-99bc7c96-6nbmb_1fd0d9c6-1443-4198-8319-642b450eecb8/placement-log/0.log"
Dec 02 08:53:47 crc kubenswrapper[4691]: I1202 08:53:47.000094 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0573471b-7d3a-484d-9195-87918928a753/setup-container/0.log"
Dec 02 08:53:47 crc kubenswrapper[4691]: I1202 08:53:47.191514 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0573471b-7d3a-484d-9195-87918928a753/setup-container/0.log"
Dec 02 08:53:47 crc kubenswrapper[4691]: I1202 08:53:47.203963 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_0573471b-7d3a-484d-9195-87918928a753/rabbitmq/0.log"
Dec 02 08:53:47 crc kubenswrapper[4691]: I1202 08:53:47.264704 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_178767a6-fba0-4c85-ab0c-0a3a1ffcc627/setup-container/0.log"
Dec 02 08:53:47 crc kubenswrapper[4691]: I1202 08:53:47.459003 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_178767a6-fba0-4c85-ab0c-0a3a1ffcc627/rabbitmq/0.log"
Dec 02 08:53:47 crc kubenswrapper[4691]: I1202 08:53:47.473800 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_178767a6-fba0-4c85-ab0c-0a3a1ffcc627/setup-container/0.log"
Dec 02 08:53:47 crc kubenswrapper[4691]: I1202 08:53:47.537402 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-zbckx_31d7c220-1ece-46e7-bbe3-1737890c15e0/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:47 crc kubenswrapper[4691]: I1202 08:53:47.666243 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-c7mwj_34cc12b6-9f55-450f-b073-0e89d0889946/redhat-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:47 crc kubenswrapper[4691]: I1202 08:53:47.746227 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-gsxfr_2a8c0a05-f1a6-4a5e-9598-9146f0074dc1/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:47 crc kubenswrapper[4691]: I1202 08:53:47.990564 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-nq6h7_1607b522-c05f-4f86-b8cb-79caa03799ed/run-os-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:48 crc kubenswrapper[4691]: I1202 08:53:48.055945 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-fsbvf_bfc83744-d9e3-4520-96ea-2ce6e382af39/ssh-known-hosts-edpm-deployment/0.log"
Dec 02 08:53:48 crc kubenswrapper[4691]: I1202 08:53:48.853417 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76d578d5f5-hmcbw_de7d695d-6d9a-4de2-830e-579f9d496f08/proxy-server/0.log"
Dec 02 08:53:48 crc kubenswrapper[4691]: I1202 08:53:48.878211 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-lchdp_82672deb-2527-4d18-8006-0f794dfe97c0/swift-ring-rebalance/0.log"
Dec 02 08:53:48 crc kubenswrapper[4691]: I1202 08:53:48.956618 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-76d578d5f5-hmcbw_de7d695d-6d9a-4de2-830e-579f9d496f08/proxy-httpd/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.097340 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/account-reaper/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.151325 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/account-auditor/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.245746 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/account-replicator/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.357277 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/account-server/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.403639 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/container-auditor/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.487967 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/container-replicator/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.492145 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/container-server/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.619487 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/container-updater/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.680808 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/object-auditor/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.751051 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/object-expirer/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.817448 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/object-server/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.827035 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/object-replicator/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.980805 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/rsync/0.log"
Dec 02 08:53:49 crc kubenswrapper[4691]: I1202 08:53:49.981789 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/object-updater/0.log"
Dec 02 08:53:50 crc kubenswrapper[4691]: I1202 08:53:50.057737 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_c25f8b81-a8e1-4035-ae92-209fd4ed5ec0/swift-recon-cron/0.log"
Dec 02 08:53:50 crc kubenswrapper[4691]: I1202 08:53:50.813144 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-qsbsg_a5fcdaa5-c1a6-4f23-b953-0d31524ee62f/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:50 crc kubenswrapper[4691]: I1202 08:53:50.920895 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_0d635e45-a63d-4661-9b82-b21d8ce59623/tempest-tests-tempest-tests-runner/0.log"
Dec 02 08:53:51 crc kubenswrapper[4691]: I1202 08:53:51.014841 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_0e4e77b4-f638-453e-9408-61dae4d0f68a/test-operator-logs-container/0.log"
Dec 02 08:53:51 crc kubenswrapper[4691]: I1202 08:53:51.185959 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-67hl6_5f7ee74e-e2c8-4144-9643-4df288709175/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Dec 02 08:53:55 crc kubenswrapper[4691]: I1202 08:53:55.561603 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4"
Dec 02 08:53:55 crc kubenswrapper[4691]: E1202 08:53:55.562575 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:53:59 crc kubenswrapper[4691]: I1202 08:53:59.697944 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_b3c7c69c-4fd9-4483-89b7-202f766ce6e5/memcached/0.log"
Dec 02 08:54:08 crc kubenswrapper[4691]: I1202 08:54:08.561921 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4"
Dec 02 08:54:08 crc kubenswrapper[4691]: E1202 08:54:08.562911 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:54:19 crc kubenswrapper[4691]: I1202 08:54:19.562701 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4"
Dec 02 08:54:19 crc kubenswrapper[4691]: E1202 08:54:19.563733 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:54:19 crc kubenswrapper[4691]: I1202 08:54:19.829675 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-w9hbv_5369081c-2142-4dfa-9482-b8d8d6d4195f/kube-rbac-proxy/0.log"
Dec 02 08:54:19 crc kubenswrapper[4691]: I1202 08:54:19.908904 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-w9hbv_5369081c-2142-4dfa-9482-b8d8d6d4195f/manager/0.log"
Dec 02 08:54:20 crc kubenswrapper[4691]: I1202 08:54:20.043889 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-66zbg_e88d7782-bcf8-4d40-aa1c-269533471279/kube-rbac-proxy/0.log"
Dec 02 08:54:20 crc kubenswrapper[4691]: I1202 08:54:20.109475 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-66zbg_e88d7782-bcf8-4d40-aa1c-269533471279/manager/0.log"
Dec 02 08:54:20 crc kubenswrapper[4691]: I1202 08:54:20.256057 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-c7ftc_9e639d67-2200-474e-9be7-55bef7c97fe6/kube-rbac-proxy/0.log"
Dec 02 08:54:20 crc kubenswrapper[4691]: I1202 08:54:20.259000 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-c7ftc_9e639d67-2200-474e-9be7-55bef7c97fe6/manager/0.log"
Dec 02 08:54:20 crc kubenswrapper[4691]: I1202 08:54:20.376401 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/util/0.log"
Dec 02 08:54:20 crc kubenswrapper[4691]: I1202 08:54:20.543701 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/pull/0.log"
Dec 02 08:54:20 crc kubenswrapper[4691]: I1202 08:54:20.557579 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/util/0.log"
Dec 02 08:54:20 crc kubenswrapper[4691]: I1202 08:54:20.605631 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/pull/0.log"
Dec 02 08:54:20 crc kubenswrapper[4691]: I1202 08:54:20.760604 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/pull/0.log"
Dec 02 08:54:20 crc kubenswrapper[4691]: I1202 08:54:20.772382 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/util/0.log"
Dec 02 08:54:20 crc kubenswrapper[4691]: I1202 08:54:20.807320 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_eb0b332f356ac10a12e312c3686a0550de3f9a1d622c92ddaaf0817127qwt6h_337754ee-e2dc-4a26-84f4-6010c0f73133/extract/0.log"
Dec 02 08:54:20 crc kubenswrapper[4691]: I1202 08:54:20.979439 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-n5f6s_f2666d2b-c30c-40d4-bfab-0e6d00571ecc/kube-rbac-proxy/0.log"
Dec 02 08:54:21 crc kubenswrapper[4691]: I1202 08:54:21.726390 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-qltbw_0361226c-435b-4221-b59d-74900b2552e1/kube-rbac-proxy/0.log"
Dec 02 08:54:21 crc kubenswrapper[4691]: I1202 08:54:21.740422 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-qltbw_0361226c-435b-4221-b59d-74900b2552e1/manager/0.log"
Dec 02 08:54:21 crc kubenswrapper[4691]: I1202 08:54:21.761058 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-n5f6s_f2666d2b-c30c-40d4-bfab-0e6d00571ecc/manager/0.log"
Dec 02 08:54:21 crc kubenswrapper[4691]: I1202 08:54:21.985889 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-d898r_c92e2d03-8848-432f-82f4-fd28b3b0fa34/manager/0.log"
Dec 02 08:54:21 crc kubenswrapper[4691]: I1202 08:54:21.988343 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-d898r_c92e2d03-8848-432f-82f4-fd28b3b0fa34/kube-rbac-proxy/0.log"
Dec 02 08:54:22 crc kubenswrapper[4691]: I1202 08:54:22.186943 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-n5t7p_b930ff47-307d-47b3-9b84-54e5860ee2db/kube-rbac-proxy/0.log"
Dec 02 08:54:22 crc kubenswrapper[4691]: I1202 08:54:22.335340 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-p2dmm_a1687ef3-9bf5-451e-aa8a-22ede53d9ed9/kube-rbac-proxy/0.log"
Dec 02 08:54:22 crc kubenswrapper[4691]: I1202 08:54:22.419835 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-n5t7p_b930ff47-307d-47b3-9b84-54e5860ee2db/manager/0.log"
Dec 02 08:54:22 crc kubenswrapper[4691]: I1202 08:54:22.775103 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-p2dmm_a1687ef3-9bf5-451e-aa8a-22ede53d9ed9/manager/0.log"
Dec 02 08:54:22 crc kubenswrapper[4691]: I1202 08:54:22.885724 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-44twc_759a905a-dc61-4206-862f-cb8b6f85882f/kube-rbac-proxy/0.log"
Dec 02 08:54:22 crc kubenswrapper[4691]: I1202 08:54:22.984029 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-44twc_759a905a-dc61-4206-862f-cb8b6f85882f/manager/0.log"
Dec 02 08:54:23 crc kubenswrapper[4691]: I1202 08:54:23.033429 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-28bs2_36d938bc-e3d6-4f21-8327-5f655a4ef54a/kube-rbac-proxy/0.log"
Dec 02 08:54:23 crc kubenswrapper[4691]: I1202 08:54:23.805400 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-28bs2_36d938bc-e3d6-4f21-8327-5f655a4ef54a/manager/0.log"
Dec 02 08:54:23 crc kubenswrapper[4691]: I1202 08:54:23.836635 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-lhkdn_0667dbf1-e305-4d12-af7b-3d532a834609/manager/0.log"
Dec 02 08:54:23 crc kubenswrapper[4691]: I1202 08:54:23.859202 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-lhkdn_0667dbf1-e305-4d12-af7b-3d532a834609/kube-rbac-proxy/0.log"
Dec 02 08:54:24 crc kubenswrapper[4691]: I1202 08:54:24.015591 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-6tdpt_f4783498-99ba-42cc-9312-8e8c6b279e5a/kube-rbac-proxy/0.log"
Dec 02 08:54:24 crc kubenswrapper[4691]: I1202 08:54:24.092815 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-2g56f_d82caf6e-e4a7-4474-8cd4-6d3f554ce608/kube-rbac-proxy/0.log"
Dec 02 08:54:24 crc kubenswrapper[4691]: I1202 08:54:24.124859 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-6tdpt_f4783498-99ba-42cc-9312-8e8c6b279e5a/manager/0.log"
Dec 02 08:54:24 crc kubenswrapper[4691]: I1202 08:54:24.254546 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-2g56f_d82caf6e-e4a7-4474-8cd4-6d3f554ce608/manager/0.log"
Dec 02 08:54:24 crc kubenswrapper[4691]: I1202 08:54:24.326859 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9p97b_e8014abc-9e31-40d3-8e34-d595a8ef95b4/kube-rbac-proxy/0.log"
Dec 02 08:54:24 crc kubenswrapper[4691]: I1202 08:54:24.340138 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-9p97b_e8014abc-9e31-40d3-8e34-d595a8ef95b4/manager/0.log"
Dec 02 08:54:24 crc kubenswrapper[4691]: I1202 08:54:24.463978 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v_e6bec2f4-8aea-472b-a0f9-591b744f9fe4/kube-rbac-proxy/0.log"
Dec 02 08:54:24 crc kubenswrapper[4691]: I1202 08:54:24.579077 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4bj98v_e6bec2f4-8aea-472b-a0f9-591b744f9fe4/manager/0.log"
Dec 02 08:54:24 crc kubenswrapper[4691]: I1202 08:54:24.953337 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-7f9cf644cb-bqpg5_e36ff06e-9c96-433a-88fc-14c6941566ee/operator/0.log"
Dec 02 08:54:24 crc kubenswrapper[4691]: I1202 08:54:24.979669 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-f2vhc_22ecfc20-6eb6-417f-ab17-8fc55057d5af/registry-server/0.log"
Dec 02 08:54:25 crc kubenswrapper[4691]: I1202 08:54:25.147928 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-gbqw2_09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b/kube-rbac-proxy/0.log"
Dec 02 08:54:25 crc kubenswrapper[4691]: I1202 08:54:25.296315 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-nlwnv_fd33241e-270e-4dbe-b024-e368b2050ece/kube-rbac-proxy/0.log"
Dec 02 08:54:25 crc kubenswrapper[4691]: I1202 08:54:25.316015 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-gbqw2_09ff76ff-10a0-4ff7-bd3a-90a0fc11bf5b/manager/0.log"
Dec 02 08:54:25 crc kubenswrapper[4691]: I1202 08:54:25.396477 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-nlwnv_fd33241e-270e-4dbe-b024-e368b2050ece/manager/0.log"
Dec 02 08:54:25 crc kubenswrapper[4691]: I1202 08:54:25.595686 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-2kbsh_b42f21ed-f361-4b6f-abc7-03b237501f65/operator/0.log"
Dec 02 08:54:25 crc kubenswrapper[4691]: I1202 08:54:25.702281 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-slsfn_dfa342c0-60d5-4025-a881-30d706833e2b/kube-rbac-proxy/0.log"
Dec 02 08:54:25 crc kubenswrapper[4691]: I1202 08:54:25.805462 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-85b7b84db-d9nql_e837878c-4a1f-463b-913c-7df163c5ba27/manager/0.log"
Dec 02 08:54:25 crc kubenswrapper[4691]: I1202 08:54:25.853600 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-666pb_1a9a40d0-4970-4551-b68e-a9697f250e94/kube-rbac-proxy/0.log"
Dec 02 08:54:25 crc kubenswrapper[4691]: I1202 08:54:25.876808 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-slsfn_dfa342c0-60d5-4025-a881-30d706833e2b/manager/0.log"
Dec 02 08:54:25 crc kubenswrapper[4691]: I1202 08:54:25.999128 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-666pb_1a9a40d0-4970-4551-b68e-a9697f250e94/manager/0.log"
Dec 02 08:54:26 crc kubenswrapper[4691]: I1202 08:54:26.098639 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wnzqd_4a080540-9871-4b8c-9d74-e3f1f3cf317c/kube-rbac-proxy/0.log"
Dec 02 08:54:26 crc kubenswrapper[4691]: I1202 08:54:26.125708 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-wnzqd_4a080540-9871-4b8c-9d74-e3f1f3cf317c/manager/0.log"
Dec 02 08:54:26 crc kubenswrapper[4691]: I1202 08:54:26.215226 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-4k8xk_f75595fc-314d-4aa9-bc60-e82c16361768/kube-rbac-proxy/0.log"
Dec 02 08:54:26 crc kubenswrapper[4691]: I1202 08:54:26.287105 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-4k8xk_f75595fc-314d-4aa9-bc60-e82c16361768/manager/0.log"
Dec 02 08:54:30 crc kubenswrapper[4691]: I1202 08:54:30.561846 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4"
Dec 02 08:54:30 crc kubenswrapper[4691]: E1202 08:54:30.562752 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:54:36 crc kubenswrapper[4691]: I1202 08:54:36.847530 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xm8ns"]
Dec 02 08:54:36 crc kubenswrapper[4691]: E1202 08:54:36.848731 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0e4ced-5fed-422f-bb29-9793bac53318" containerName="container-00"
Dec 02 08:54:36 crc kubenswrapper[4691]: I1202 08:54:36.848749 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0e4ced-5fed-422f-bb29-9793bac53318" containerName="container-00"
Dec 02 08:54:36 crc kubenswrapper[4691]: I1202 08:54:36.848986 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0e4ced-5fed-422f-bb29-9793bac53318" containerName="container-00"
Dec 02 08:54:36 crc kubenswrapper[4691]: I1202 08:54:36.851343 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:36 crc kubenswrapper[4691]: I1202 08:54:36.859419 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xm8ns"]
Dec 02 08:54:36 crc kubenswrapper[4691]: I1202 08:54:36.900264 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xcqs\" (UniqueName: \"kubernetes.io/projected/cac161fc-d06b-4d0d-b059-796e0592cfe7-kube-api-access-4xcqs\") pod \"community-operators-xm8ns\" (UID: \"cac161fc-d06b-4d0d-b059-796e0592cfe7\") " pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:36 crc kubenswrapper[4691]: I1202 08:54:36.900349 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac161fc-d06b-4d0d-b059-796e0592cfe7-utilities\") pod \"community-operators-xm8ns\" (UID: \"cac161fc-d06b-4d0d-b059-796e0592cfe7\") " pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:36 crc kubenswrapper[4691]: I1202 08:54:36.900538 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac161fc-d06b-4d0d-b059-796e0592cfe7-catalog-content\") pod \"community-operators-xm8ns\" (UID: \"cac161fc-d06b-4d0d-b059-796e0592cfe7\") " pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:37 crc kubenswrapper[4691]: I1202 08:54:37.002920 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac161fc-d06b-4d0d-b059-796e0592cfe7-catalog-content\") pod \"community-operators-xm8ns\" (UID: \"cac161fc-d06b-4d0d-b059-796e0592cfe7\") " pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:37 crc kubenswrapper[4691]: I1202 08:54:37.003214 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xcqs\" (UniqueName: \"kubernetes.io/projected/cac161fc-d06b-4d0d-b059-796e0592cfe7-kube-api-access-4xcqs\") pod \"community-operators-xm8ns\" (UID: \"cac161fc-d06b-4d0d-b059-796e0592cfe7\") " pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:37 crc kubenswrapper[4691]: I1202 08:54:37.003252 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac161fc-d06b-4d0d-b059-796e0592cfe7-utilities\") pod \"community-operators-xm8ns\" (UID: \"cac161fc-d06b-4d0d-b059-796e0592cfe7\") " pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:37 crc kubenswrapper[4691]: I1202 08:54:37.003912 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac161fc-d06b-4d0d-b059-796e0592cfe7-utilities\") pod \"community-operators-xm8ns\" (UID: \"cac161fc-d06b-4d0d-b059-796e0592cfe7\") " pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:37 crc kubenswrapper[4691]: I1202 08:54:37.004291 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac161fc-d06b-4d0d-b059-796e0592cfe7-catalog-content\") pod \"community-operators-xm8ns\" (UID: \"cac161fc-d06b-4d0d-b059-796e0592cfe7\") " pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:37 crc kubenswrapper[4691]: I1202 08:54:37.027357 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xcqs\" (UniqueName: \"kubernetes.io/projected/cac161fc-d06b-4d0d-b059-796e0592cfe7-kube-api-access-4xcqs\") pod \"community-operators-xm8ns\" (UID: \"cac161fc-d06b-4d0d-b059-796e0592cfe7\") " pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:37 crc kubenswrapper[4691]: I1202 08:54:37.178385 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:37 crc kubenswrapper[4691]: I1202 08:54:37.780711 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xm8ns"]
Dec 02 08:54:37 crc kubenswrapper[4691]: I1202 08:54:37.950734 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm8ns" event={"ID":"cac161fc-d06b-4d0d-b059-796e0592cfe7","Type":"ContainerStarted","Data":"3930bbef00e5f7613b2f5e541581c8cf4dc4c9680330d5d3f7337e51f50531af"}
Dec 02 08:54:38 crc kubenswrapper[4691]: I1202 08:54:38.963242 4691 generic.go:334] "Generic (PLEG): container finished" podID="cac161fc-d06b-4d0d-b059-796e0592cfe7" containerID="c6bba29da777ceb8004929dbd531a7e6b7e41bc2e09ae1aa47e268012d0f1980" exitCode=0
Dec 02 08:54:38 crc kubenswrapper[4691]: I1202 08:54:38.963482 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm8ns" event={"ID":"cac161fc-d06b-4d0d-b059-796e0592cfe7","Type":"ContainerDied","Data":"c6bba29da777ceb8004929dbd531a7e6b7e41bc2e09ae1aa47e268012d0f1980"}
Dec 02 08:54:38 crc kubenswrapper[4691]: I1202 08:54:38.966740 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 02 08:54:39 crc kubenswrapper[4691]: I1202 08:54:39.975794 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm8ns" event={"ID":"cac161fc-d06b-4d0d-b059-796e0592cfe7","Type":"ContainerStarted","Data":"73d7558aedc2a5f4c8f4d239933dc6bc47e39d4a2cdd0891397f55ff8962817f"}
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.230345 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2p5df"]
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.232362 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.248911 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2p5df"]
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.316638 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b853317-baf3-4f45-b212-df3ea408de49-catalog-content\") pod \"redhat-operators-2p5df\" (UID: \"9b853317-baf3-4f45-b212-df3ea408de49\") " pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.316780 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88pkk\" (UniqueName: \"kubernetes.io/projected/9b853317-baf3-4f45-b212-df3ea408de49-kube-api-access-88pkk\") pod \"redhat-operators-2p5df\" (UID: \"9b853317-baf3-4f45-b212-df3ea408de49\") " pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.316829 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b853317-baf3-4f45-b212-df3ea408de49-utilities\") pod \"redhat-operators-2p5df\" (UID: \"9b853317-baf3-4f45-b212-df3ea408de49\") " pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.418098 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88pkk\" (UniqueName: \"kubernetes.io/projected/9b853317-baf3-4f45-b212-df3ea408de49-kube-api-access-88pkk\") pod \"redhat-operators-2p5df\" (UID: \"9b853317-baf3-4f45-b212-df3ea408de49\") " pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.418180 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b853317-baf3-4f45-b212-df3ea408de49-utilities\") pod \"redhat-operators-2p5df\" (UID: \"9b853317-baf3-4f45-b212-df3ea408de49\") " pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.418273 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b853317-baf3-4f45-b212-df3ea408de49-catalog-content\") pod \"redhat-operators-2p5df\" (UID: \"9b853317-baf3-4f45-b212-df3ea408de49\") " pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.418774 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b853317-baf3-4f45-b212-df3ea408de49-catalog-content\") pod \"redhat-operators-2p5df\" (UID: \"9b853317-baf3-4f45-b212-df3ea408de49\") " pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.418807 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b853317-baf3-4f45-b212-df3ea408de49-utilities\") pod \"redhat-operators-2p5df\" (UID: \"9b853317-baf3-4f45-b212-df3ea408de49\") " pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.430380 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hg2vw"]
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.432820 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.448268 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hg2vw"]
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.450538 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88pkk\" (UniqueName: \"kubernetes.io/projected/9b853317-baf3-4f45-b212-df3ea408de49-kube-api-access-88pkk\") pod \"redhat-operators-2p5df\" (UID: \"9b853317-baf3-4f45-b212-df3ea408de49\") " pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.521884 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c40c551-88a3-4049-b64f-5874d246b71c-utilities\") pod \"certified-operators-hg2vw\" (UID: \"2c40c551-88a3-4049-b64f-5874d246b71c\") " pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.521954 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzvxn\" (UniqueName: \"kubernetes.io/projected/2c40c551-88a3-4049-b64f-5874d246b71c-kube-api-access-jzvxn\") pod \"certified-operators-hg2vw\" (UID: \"2c40c551-88a3-4049-b64f-5874d246b71c\") " pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.521980 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c40c551-88a3-4049-b64f-5874d246b71c-catalog-content\") pod \"certified-operators-hg2vw\" (UID: \"2c40c551-88a3-4049-b64f-5874d246b71c\") " pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.571130 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.623531 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzvxn\" (UniqueName: \"kubernetes.io/projected/2c40c551-88a3-4049-b64f-5874d246b71c-kube-api-access-jzvxn\") pod \"certified-operators-hg2vw\" (UID: \"2c40c551-88a3-4049-b64f-5874d246b71c\") " pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.624219 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c40c551-88a3-4049-b64f-5874d246b71c-catalog-content\") pod \"certified-operators-hg2vw\" (UID: \"2c40c551-88a3-4049-b64f-5874d246b71c\") " pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.624781 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c40c551-88a3-4049-b64f-5874d246b71c-utilities\") pod \"certified-operators-hg2vw\" (UID: \"2c40c551-88a3-4049-b64f-5874d246b71c\") " pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.625329 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c40c551-88a3-4049-b64f-5874d246b71c-utilities\") pod \"certified-operators-hg2vw\" (UID: \"2c40c551-88a3-4049-b64f-5874d246b71c\") " pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.625925 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c40c551-88a3-4049-b64f-5874d246b71c-catalog-content\") pod \"certified-operators-hg2vw\" (UID: \"2c40c551-88a3-4049-b64f-5874d246b71c\") " pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.647197 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzvxn\" (UniqueName: \"kubernetes.io/projected/2c40c551-88a3-4049-b64f-5874d246b71c-kube-api-access-jzvxn\") pod \"certified-operators-hg2vw\" (UID: \"2c40c551-88a3-4049-b64f-5874d246b71c\") " pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.749737 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:40 crc kubenswrapper[4691]: I1202 08:54:40.987682 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2p5df"]
Dec 02 08:54:41 crc kubenswrapper[4691]: I1202 08:54:41.006178 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p5df" event={"ID":"9b853317-baf3-4f45-b212-df3ea408de49","Type":"ContainerStarted","Data":"151b7577e503cd24d95adc503f46aa225e2c16c03816e3488b1894f3a3575ed7"}
Dec 02 08:54:41 crc kubenswrapper[4691]: I1202 08:54:41.023126 4691 generic.go:334] "Generic (PLEG): container finished" podID="cac161fc-d06b-4d0d-b059-796e0592cfe7" containerID="73d7558aedc2a5f4c8f4d239933dc6bc47e39d4a2cdd0891397f55ff8962817f" exitCode=0
Dec 02 08:54:41 crc kubenswrapper[4691]: I1202 08:54:41.023181 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm8ns" event={"ID":"cac161fc-d06b-4d0d-b059-796e0592cfe7","Type":"ContainerDied","Data":"73d7558aedc2a5f4c8f4d239933dc6bc47e39d4a2cdd0891397f55ff8962817f"}
Dec 02 08:54:41 crc kubenswrapper[4691]: W1202 08:54:41.458999 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c40c551_88a3_4049_b64f_5874d246b71c.slice/crio-ce431d291a6275081e0cb9e1e23a1a851f968e43196a58afe68e394e13de6e6a WatchSource:0}: Error finding container ce431d291a6275081e0cb9e1e23a1a851f968e43196a58afe68e394e13de6e6a: Status 404 returned error can't find the container with id ce431d291a6275081e0cb9e1e23a1a851f968e43196a58afe68e394e13de6e6a
Dec 02 08:54:41 crc kubenswrapper[4691]: I1202 08:54:41.463416 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hg2vw"]
Dec 02 08:54:42 crc kubenswrapper[4691]: I1202 08:54:42.035253 4691 generic.go:334] "Generic (PLEG): container finished" podID="2c40c551-88a3-4049-b64f-5874d246b71c" containerID="3821cfbf5bcbb8849367922ee4d0985d64ec718a8f09a001892ccc8178a54e15" exitCode=0
Dec 02 08:54:42 crc kubenswrapper[4691]: I1202 08:54:42.035334 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hg2vw" event={"ID":"2c40c551-88a3-4049-b64f-5874d246b71c","Type":"ContainerDied","Data":"3821cfbf5bcbb8849367922ee4d0985d64ec718a8f09a001892ccc8178a54e15"}
Dec 02 08:54:42 crc kubenswrapper[4691]: I1202 08:54:42.035718 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hg2vw" event={"ID":"2c40c551-88a3-4049-b64f-5874d246b71c","Type":"ContainerStarted","Data":"ce431d291a6275081e0cb9e1e23a1a851f968e43196a58afe68e394e13de6e6a"}
Dec 02 08:54:42 crc kubenswrapper[4691]: I1202 08:54:42.037723 4691 generic.go:334] "Generic (PLEG): container finished" podID="9b853317-baf3-4f45-b212-df3ea408de49" containerID="afb22a233515b1838c6e3acc9a94a5b7b56aaa3f08a04f93fe48fee736e67a31" exitCode=0
Dec 02 08:54:42 crc kubenswrapper[4691]: I1202 08:54:42.037786 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p5df" event={"ID":"9b853317-baf3-4f45-b212-df3ea408de49","Type":"ContainerDied","Data":"afb22a233515b1838c6e3acc9a94a5b7b56aaa3f08a04f93fe48fee736e67a31"}
Dec 02 08:54:42 crc kubenswrapper[4691]: I1202 08:54:42.041428 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm8ns" event={"ID":"cac161fc-d06b-4d0d-b059-796e0592cfe7","Type":"ContainerStarted","Data":"165badfadddecc254f5e7f0ebbff5313be783f32467f44438a1f28246f0f242f"}
Dec 02 08:54:42 crc kubenswrapper[4691]: I1202 08:54:42.097577 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xm8ns" podStartSLOduration=3.494359331 podStartE2EDuration="6.097550727s" podCreationTimestamp="2025-12-02 08:54:36 +0000 UTC" firstStartedPulling="2025-12-02 08:54:38.966449534 +0000 UTC m=+4126.750528396" lastFinishedPulling="2025-12-02 08:54:41.56964093 +0000 UTC m=+4129.353719792" observedRunningTime="2025-12-02 08:54:42.086780446 +0000 UTC m=+4129.870859318" watchObservedRunningTime="2025-12-02 08:54:42.097550727 +0000 UTC m=+4129.881629609"
Dec 02 08:54:44 crc kubenswrapper[4691]: I1202 08:54:44.060190 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hg2vw" event={"ID":"2c40c551-88a3-4049-b64f-5874d246b71c","Type":"ContainerStarted","Data":"c92423e87262c96339e6712aa3fb2cd75e2c4c3a5b41076f03291b6aba59cff5"}
Dec 02 08:54:44 crc kubenswrapper[4691]: I1202 08:54:44.061753 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p5df" event={"ID":"9b853317-baf3-4f45-b212-df3ea408de49","Type":"ContainerStarted","Data":"5d551ea161e63146af5330b7851040b400e038fad3619640b3814a9d3be62269"}
Dec 02 08:54:44 crc kubenswrapper[4691]: I1202 08:54:44.562284 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4"
Dec 02 08:54:44 crc kubenswrapper[4691]: E1202 08:54:44.562629 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-mgbt6_openshift-machine-config-operator(82103e10-1127-4a84-b5fc-9d0d6a259932)\"" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932"
Dec 02 08:54:45 crc kubenswrapper[4691]: I1202 08:54:45.072401 4691 generic.go:334] "Generic (PLEG): container finished" podID="2c40c551-88a3-4049-b64f-5874d246b71c" containerID="c92423e87262c96339e6712aa3fb2cd75e2c4c3a5b41076f03291b6aba59cff5" exitCode=0
Dec 02 08:54:45 crc kubenswrapper[4691]: I1202 08:54:45.072449 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hg2vw" event={"ID":"2c40c551-88a3-4049-b64f-5874d246b71c","Type":"ContainerDied","Data":"c92423e87262c96339e6712aa3fb2cd75e2c4c3a5b41076f03291b6aba59cff5"}
Dec 02 08:54:45 crc kubenswrapper[4691]: I1202 08:54:45.074993 4691 generic.go:334] "Generic (PLEG): container finished" podID="9b853317-baf3-4f45-b212-df3ea408de49" containerID="5d551ea161e63146af5330b7851040b400e038fad3619640b3814a9d3be62269" exitCode=0
Dec 02 08:54:45 crc kubenswrapper[4691]: I1202 08:54:45.075038 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p5df" event={"ID":"9b853317-baf3-4f45-b212-df3ea408de49","Type":"ContainerDied","Data":"5d551ea161e63146af5330b7851040b400e038fad3619640b3814a9d3be62269"}
Dec 02 08:54:46 crc kubenswrapper[4691]: I1202 08:54:46.088805 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p5df" event={"ID":"9b853317-baf3-4f45-b212-df3ea408de49","Type":"ContainerStarted","Data":"d5a7b33139dae005a852ff788e2e7050cc6aa8160ee10f3f1c100d433804973f"}
Dec 02 08:54:46 crc kubenswrapper[4691]: I1202 08:54:46.091913 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hg2vw" event={"ID":"2c40c551-88a3-4049-b64f-5874d246b71c","Type":"ContainerStarted","Data":"e4b6b9fa3d5f8f19426f452bb985a71408c917a89dd82dada20c551ea93d6904"}
Dec 02 08:54:46 crc kubenswrapper[4691]: I1202 08:54:46.112835 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2p5df" podStartSLOduration=2.575791972 podStartE2EDuration="6.112807482s" podCreationTimestamp="2025-12-02 08:54:40 +0000 UTC" firstStartedPulling="2025-12-02 08:54:42.039084516 +0000 UTC m=+4129.823163388" lastFinishedPulling="2025-12-02 08:54:45.576100036 +0000 UTC m=+4133.360178898" observedRunningTime="2025-12-02 08:54:46.105703474 +0000 UTC m=+4133.889782346" watchObservedRunningTime="2025-12-02 08:54:46.112807482 +0000 UTC m=+4133.896886354"
Dec 02 08:54:46 crc kubenswrapper[4691]: I1202 08:54:46.149913 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hg2vw" podStartSLOduration=2.562719135 podStartE2EDuration="6.149886905s" podCreationTimestamp="2025-12-02 08:54:40 +0000 UTC" firstStartedPulling="2025-12-02 08:54:42.038463931 +0000 UTC m=+4129.822542793" lastFinishedPulling="2025-12-02 08:54:45.625631701 +0000 UTC m=+4133.409710563" observedRunningTime="2025-12-02 08:54:46.147455633 +0000 UTC m=+4133.931534525" watchObservedRunningTime="2025-12-02 08:54:46.149886905 +0000 UTC m=+4133.933965767"
Dec 02 08:54:47 crc kubenswrapper[4691]: I1202 08:54:47.179398 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:47 crc kubenswrapper[4691]: I1202 08:54:47.179830 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:47 crc kubenswrapper[4691]: I1202 08:54:47.239270 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:48 crc kubenswrapper[4691]: I1202 08:54:48.161220 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xm8ns"
Dec 02 08:54:50 crc kubenswrapper[4691]: I1202 08:54:50.025010 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xm8ns"]
Dec 02 08:54:50 crc kubenswrapper[4691]: I1202 08:54:50.136236 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xm8ns" podUID="cac161fc-d06b-4d0d-b059-796e0592cfe7" containerName="registry-server" containerID="cri-o://165badfadddecc254f5e7f0ebbff5313be783f32467f44438a1f28246f0f242f" gracePeriod=2
Dec 02 08:54:50 crc kubenswrapper[4691]: I1202 08:54:50.573141 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:50 crc kubenswrapper[4691]: I1202 08:54:50.573494 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2p5df"
Dec 02 08:54:50 crc kubenswrapper[4691]: I1202 08:54:50.750695 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:50 crc kubenswrapper[4691]: I1202 08:54:50.751048 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:50 crc kubenswrapper[4691]: I1202 08:54:50.803431 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:50 crc kubenswrapper[4691]: I1202 08:54:50.866817 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8x9pv_39b63660-5845-41eb-94ca-ffc8ccb34413/control-plane-machine-set-operator/0.log"
Dec 02 08:54:51 crc kubenswrapper[4691]: I1202 08:54:51.078311 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qvdbg_677bf100-9036-4b58-9658-6b918304ba47/kube-rbac-proxy/0.log"
Dec 02 08:54:51 crc kubenswrapper[4691]: I1202 08:54:51.086507 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qvdbg_677bf100-9036-4b58-9658-6b918304ba47/machine-api-operator/0.log"
Dec 02 08:54:51 crc kubenswrapper[4691]: I1202 08:54:51.193508 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hg2vw"
Dec 02 08:54:51 crc kubenswrapper[4691]: I1202 08:54:51.623304 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2p5df" podUID="9b853317-baf3-4f45-b212-df3ea408de49" containerName="registry-server" probeResult="failure" output=<
Dec 02 08:54:51 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s
Dec 02 08:54:51 crc kubenswrapper[4691]: >
Dec 02 08:54:52 crc kubenswrapper[4691]: I1202 08:54:52.425260 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hg2vw"]
Dec 02 08:54:54 crc kubenswrapper[4691]: I1202 08:54:54.186243 4691 generic.go:334] "Generic (PLEG): container finished" podID="cac161fc-d06b-4d0d-b059-796e0592cfe7" containerID="165badfadddecc254f5e7f0ebbff5313be783f32467f44438a1f28246f0f242f" exitCode=0
Dec 02 08:54:54 crc kubenswrapper[4691]: I1202 08:54:54.186321 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm8ns" event={"ID":"cac161fc-d06b-4d0d-b059-796e0592cfe7","Type":"ContainerDied","Data":"165badfadddecc254f5e7f0ebbff5313be783f32467f44438a1f28246f0f242f"}
Dec 02 08:54:54 crc kubenswrapper[4691]: I1202 08:54:54.186983 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hg2vw" podUID="2c40c551-88a3-4049-b64f-5874d246b71c" containerName="registry-server" containerID="cri-o://e4b6b9fa3d5f8f19426f452bb985a71408c917a89dd82dada20c551ea93d6904" gracePeriod=2
Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.013612 4691 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-xm8ns" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.148891 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xcqs\" (UniqueName: \"kubernetes.io/projected/cac161fc-d06b-4d0d-b059-796e0592cfe7-kube-api-access-4xcqs\") pod \"cac161fc-d06b-4d0d-b059-796e0592cfe7\" (UID: \"cac161fc-d06b-4d0d-b059-796e0592cfe7\") " Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.149077 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac161fc-d06b-4d0d-b059-796e0592cfe7-utilities\") pod \"cac161fc-d06b-4d0d-b059-796e0592cfe7\" (UID: \"cac161fc-d06b-4d0d-b059-796e0592cfe7\") " Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.149123 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac161fc-d06b-4d0d-b059-796e0592cfe7-catalog-content\") pod \"cac161fc-d06b-4d0d-b059-796e0592cfe7\" (UID: \"cac161fc-d06b-4d0d-b059-796e0592cfe7\") " Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.150460 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cac161fc-d06b-4d0d-b059-796e0592cfe7-utilities" (OuterVolumeSpecName: "utilities") pod "cac161fc-d06b-4d0d-b059-796e0592cfe7" (UID: "cac161fc-d06b-4d0d-b059-796e0592cfe7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.160118 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac161fc-d06b-4d0d-b059-796e0592cfe7-kube-api-access-4xcqs" (OuterVolumeSpecName: "kube-api-access-4xcqs") pod "cac161fc-d06b-4d0d-b059-796e0592cfe7" (UID: "cac161fc-d06b-4d0d-b059-796e0592cfe7"). InnerVolumeSpecName "kube-api-access-4xcqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.200319 4691 generic.go:334] "Generic (PLEG): container finished" podID="2c40c551-88a3-4049-b64f-5874d246b71c" containerID="e4b6b9fa3d5f8f19426f452bb985a71408c917a89dd82dada20c551ea93d6904" exitCode=0 Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.200372 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hg2vw" event={"ID":"2c40c551-88a3-4049-b64f-5874d246b71c","Type":"ContainerDied","Data":"e4b6b9fa3d5f8f19426f452bb985a71408c917a89dd82dada20c551ea93d6904"} Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.205137 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xm8ns" event={"ID":"cac161fc-d06b-4d0d-b059-796e0592cfe7","Type":"ContainerDied","Data":"3930bbef00e5f7613b2f5e541581c8cf4dc4c9680330d5d3f7337e51f50531af"} Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.205203 4691 scope.go:117] "RemoveContainer" containerID="165badfadddecc254f5e7f0ebbff5313be783f32467f44438a1f28246f0f242f" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.205435 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xm8ns" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.205802 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cac161fc-d06b-4d0d-b059-796e0592cfe7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cac161fc-d06b-4d0d-b059-796e0592cfe7" (UID: "cac161fc-d06b-4d0d-b059-796e0592cfe7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.228395 4691 scope.go:117] "RemoveContainer" containerID="73d7558aedc2a5f4c8f4d239933dc6bc47e39d4a2cdd0891397f55ff8962817f" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.252308 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac161fc-d06b-4d0d-b059-796e0592cfe7-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.252337 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac161fc-d06b-4d0d-b059-796e0592cfe7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.252348 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xcqs\" (UniqueName: \"kubernetes.io/projected/cac161fc-d06b-4d0d-b059-796e0592cfe7-kube-api-access-4xcqs\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.266954 4691 scope.go:117] "RemoveContainer" containerID="c6bba29da777ceb8004929dbd531a7e6b7e41bc2e09ae1aa47e268012d0f1980" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.570047 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xm8ns"] Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.586271 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xm8ns"] Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.637838 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hg2vw" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.762555 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzvxn\" (UniqueName: \"kubernetes.io/projected/2c40c551-88a3-4049-b64f-5874d246b71c-kube-api-access-jzvxn\") pod \"2c40c551-88a3-4049-b64f-5874d246b71c\" (UID: \"2c40c551-88a3-4049-b64f-5874d246b71c\") " Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.762641 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c40c551-88a3-4049-b64f-5874d246b71c-catalog-content\") pod \"2c40c551-88a3-4049-b64f-5874d246b71c\" (UID: \"2c40c551-88a3-4049-b64f-5874d246b71c\") " Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.762675 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c40c551-88a3-4049-b64f-5874d246b71c-utilities\") pod \"2c40c551-88a3-4049-b64f-5874d246b71c\" (UID: \"2c40c551-88a3-4049-b64f-5874d246b71c\") " Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.764374 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c40c551-88a3-4049-b64f-5874d246b71c-utilities" (OuterVolumeSpecName: "utilities") pod "2c40c551-88a3-4049-b64f-5874d246b71c" (UID: "2c40c551-88a3-4049-b64f-5874d246b71c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.768480 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c40c551-88a3-4049-b64f-5874d246b71c-kube-api-access-jzvxn" (OuterVolumeSpecName: "kube-api-access-jzvxn") pod "2c40c551-88a3-4049-b64f-5874d246b71c" (UID: "2c40c551-88a3-4049-b64f-5874d246b71c"). InnerVolumeSpecName "kube-api-access-jzvxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.810223 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c40c551-88a3-4049-b64f-5874d246b71c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c40c551-88a3-4049-b64f-5874d246b71c" (UID: "2c40c551-88a3-4049-b64f-5874d246b71c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.865037 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzvxn\" (UniqueName: \"kubernetes.io/projected/2c40c551-88a3-4049-b64f-5874d246b71c-kube-api-access-jzvxn\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.865069 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c40c551-88a3-4049-b64f-5874d246b71c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:55 crc kubenswrapper[4691]: I1202 08:54:55.865084 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c40c551-88a3-4049-b64f-5874d246b71c-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:54:56 crc kubenswrapper[4691]: I1202 08:54:56.225013 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hg2vw" event={"ID":"2c40c551-88a3-4049-b64f-5874d246b71c","Type":"ContainerDied","Data":"ce431d291a6275081e0cb9e1e23a1a851f968e43196a58afe68e394e13de6e6a"} Dec 02 08:54:56 crc kubenswrapper[4691]: I1202 08:54:56.225071 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hg2vw" Dec 02 08:54:56 crc kubenswrapper[4691]: I1202 08:54:56.225088 4691 scope.go:117] "RemoveContainer" containerID="e4b6b9fa3d5f8f19426f452bb985a71408c917a89dd82dada20c551ea93d6904" Dec 02 08:54:56 crc kubenswrapper[4691]: I1202 08:54:56.249088 4691 scope.go:117] "RemoveContainer" containerID="c92423e87262c96339e6712aa3fb2cd75e2c4c3a5b41076f03291b6aba59cff5" Dec 02 08:54:56 crc kubenswrapper[4691]: I1202 08:54:56.274182 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hg2vw"] Dec 02 08:54:56 crc kubenswrapper[4691]: I1202 08:54:56.284149 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hg2vw"] Dec 02 08:54:56 crc kubenswrapper[4691]: I1202 08:54:56.284318 4691 scope.go:117] "RemoveContainer" containerID="3821cfbf5bcbb8849367922ee4d0985d64ec718a8f09a001892ccc8178a54e15" Dec 02 08:54:56 crc kubenswrapper[4691]: I1202 08:54:56.573659 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c40c551-88a3-4049-b64f-5874d246b71c" path="/var/lib/kubelet/pods/2c40c551-88a3-4049-b64f-5874d246b71c/volumes" Dec 02 08:54:56 crc kubenswrapper[4691]: I1202 08:54:56.574479 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac161fc-d06b-4d0d-b059-796e0592cfe7" path="/var/lib/kubelet/pods/cac161fc-d06b-4d0d-b059-796e0592cfe7/volumes" Dec 02 08:54:59 crc kubenswrapper[4691]: I1202 08:54:59.768715 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:55:00 crc kubenswrapper[4691]: I1202 08:55:00.268550 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"14a4380008772d3ff382b5213cbcdff86e0c2cf3ba060c707ae18d2b1fb84fae"} Dec 02 08:55:00 crc kubenswrapper[4691]: I1202 08:55:00.626877 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2p5df" Dec 02 08:55:00 crc kubenswrapper[4691]: I1202 08:55:00.684804 4691 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2p5df" Dec 02 08:55:01 crc kubenswrapper[4691]: I1202 08:55:01.024333 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2p5df"] Dec 02 08:55:02 crc kubenswrapper[4691]: I1202 08:55:02.435433 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2p5df" podUID="9b853317-baf3-4f45-b212-df3ea408de49" containerName="registry-server" containerID="cri-o://d5a7b33139dae005a852ff788e2e7050cc6aa8160ee10f3f1c100d433804973f" gracePeriod=2 Dec 02 08:55:03 crc kubenswrapper[4691]: I1202 08:55:03.449196 4691 generic.go:334] "Generic (PLEG): container finished" podID="9b853317-baf3-4f45-b212-df3ea408de49" containerID="d5a7b33139dae005a852ff788e2e7050cc6aa8160ee10f3f1c100d433804973f" exitCode=0 Dec 02 08:55:03 crc kubenswrapper[4691]: I1202 08:55:03.449388 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p5df" event={"ID":"9b853317-baf3-4f45-b212-df3ea408de49","Type":"ContainerDied","Data":"d5a7b33139dae005a852ff788e2e7050cc6aa8160ee10f3f1c100d433804973f"} Dec 02 08:55:03 crc kubenswrapper[4691]: I1202 08:55:03.449822 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2p5df" event={"ID":"9b853317-baf3-4f45-b212-df3ea408de49","Type":"ContainerDied","Data":"151b7577e503cd24d95adc503f46aa225e2c16c03816e3488b1894f3a3575ed7"} Dec 02 08:55:03 crc kubenswrapper[4691]: I1202 08:55:03.449837 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="151b7577e503cd24d95adc503f46aa225e2c16c03816e3488b1894f3a3575ed7" Dec 02 08:55:03 crc kubenswrapper[4691]: I1202 08:55:03.533927 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2p5df" Dec 02 08:55:03 crc kubenswrapper[4691]: I1202 08:55:03.645592 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b853317-baf3-4f45-b212-df3ea408de49-catalog-content\") pod \"9b853317-baf3-4f45-b212-df3ea408de49\" (UID: \"9b853317-baf3-4f45-b212-df3ea408de49\") " Dec 02 08:55:03 crc kubenswrapper[4691]: I1202 08:55:03.645914 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88pkk\" (UniqueName: \"kubernetes.io/projected/9b853317-baf3-4f45-b212-df3ea408de49-kube-api-access-88pkk\") pod \"9b853317-baf3-4f45-b212-df3ea408de49\" (UID: \"9b853317-baf3-4f45-b212-df3ea408de49\") " Dec 02 08:55:03 crc kubenswrapper[4691]: I1202 08:55:03.645943 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b853317-baf3-4f45-b212-df3ea408de49-utilities\") pod \"9b853317-baf3-4f45-b212-df3ea408de49\" (UID: \"9b853317-baf3-4f45-b212-df3ea408de49\") " Dec 02 08:55:03 crc kubenswrapper[4691]: I1202 08:55:03.646831 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b853317-baf3-4f45-b212-df3ea408de49-utilities" (OuterVolumeSpecName: "utilities") pod "9b853317-baf3-4f45-b212-df3ea408de49" (UID: "9b853317-baf3-4f45-b212-df3ea408de49"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:55:03 crc kubenswrapper[4691]: I1202 08:55:03.654943 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b853317-baf3-4f45-b212-df3ea408de49-kube-api-access-88pkk" (OuterVolumeSpecName: "kube-api-access-88pkk") pod "9b853317-baf3-4f45-b212-df3ea408de49" (UID: "9b853317-baf3-4f45-b212-df3ea408de49"). InnerVolumeSpecName "kube-api-access-88pkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:55:04 crc kubenswrapper[4691]: I1202 08:55:04.035293 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88pkk\" (UniqueName: \"kubernetes.io/projected/9b853317-baf3-4f45-b212-df3ea408de49-kube-api-access-88pkk\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:04 crc kubenswrapper[4691]: I1202 08:55:04.035333 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b853317-baf3-4f45-b212-df3ea408de49-utilities\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:04 crc kubenswrapper[4691]: I1202 08:55:04.063153 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b853317-baf3-4f45-b212-df3ea408de49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b853317-baf3-4f45-b212-df3ea408de49" (UID: "9b853317-baf3-4f45-b212-df3ea408de49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:55:04 crc kubenswrapper[4691]: I1202 08:55:04.137527 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b853317-baf3-4f45-b212-df3ea408de49-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 02 08:55:04 crc kubenswrapper[4691]: I1202 08:55:04.467537 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2p5df" Dec 02 08:55:04 crc kubenswrapper[4691]: I1202 08:55:04.512812 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2p5df"] Dec 02 08:55:04 crc kubenswrapper[4691]: I1202 08:55:04.538739 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2p5df"] Dec 02 08:55:04 crc kubenswrapper[4691]: I1202 08:55:04.578360 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b853317-baf3-4f45-b212-df3ea408de49" path="/var/lib/kubelet/pods/9b853317-baf3-4f45-b212-df3ea408de49/volumes" Dec 02 08:55:08 crc kubenswrapper[4691]: I1202 08:55:08.326804 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-6jm2z_d29342d1-9924-4226-ae63-6e405a469f70/cert-manager-controller/0.log" Dec 02 08:55:08 crc kubenswrapper[4691]: I1202 08:55:08.711691 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-bjslb_e8a4ac7a-1f01-4c68-b5a2-e2200f0c31de/cert-manager-cainjector/0.log" Dec 02 08:55:09 crc kubenswrapper[4691]: I1202 08:55:09.249803 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-5gp8z_ee012121-3006-49dc-9f6d-349cdbc940a1/cert-manager-webhook/0.log" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.158244 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-n7brg"] Dec 02 08:55:22 crc kubenswrapper[4691]: E1202 08:55:22.159413 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac161fc-d06b-4d0d-b059-796e0592cfe7" containerName="extract-utilities" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.159431 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac161fc-d06b-4d0d-b059-796e0592cfe7" containerName="extract-utilities" Dec 02 08:55:22 crc kubenswrapper[4691]: E1202 08:55:22.159452 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c40c551-88a3-4049-b64f-5874d246b71c" containerName="extract-content" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.159464 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c40c551-88a3-4049-b64f-5874d246b71c" containerName="extract-content" Dec 02 08:55:22 crc kubenswrapper[4691]: E1202 08:55:22.159486 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c40c551-88a3-4049-b64f-5874d246b71c" containerName="extract-utilities" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.159495 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c40c551-88a3-4049-b64f-5874d246b71c" containerName="extract-utilities" Dec 02 08:55:22 crc kubenswrapper[4691]: E1202 08:55:22.159510 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac161fc-d06b-4d0d-b059-796e0592cfe7" containerName="registry-server" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.159517 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac161fc-d06b-4d0d-b059-796e0592cfe7" containerName="registry-server" Dec 02 08:55:22 crc kubenswrapper[4691]: E1202 08:55:22.159532 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b853317-baf3-4f45-b212-df3ea408de49" containerName="extract-utilities" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.159541 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b853317-baf3-4f45-b212-df3ea408de49" 
containerName="extract-utilities" Dec 02 08:55:22 crc kubenswrapper[4691]: E1202 08:55:22.159552 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b853317-baf3-4f45-b212-df3ea408de49" containerName="registry-server" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.159559 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b853317-baf3-4f45-b212-df3ea408de49" containerName="registry-server" Dec 02 08:55:22 crc kubenswrapper[4691]: E1202 08:55:22.159579 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b853317-baf3-4f45-b212-df3ea408de49" containerName="extract-content" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.159588 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b853317-baf3-4f45-b212-df3ea408de49" containerName="extract-content" Dec 02 08:55:22 crc kubenswrapper[4691]: E1202 08:55:22.159603 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c40c551-88a3-4049-b64f-5874d246b71c" containerName="registry-server" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.159609 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c40c551-88a3-4049-b64f-5874d246b71c" containerName="registry-server" Dec 02 08:55:22 crc kubenswrapper[4691]: E1202 08:55:22.159620 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac161fc-d06b-4d0d-b059-796e0592cfe7" containerName="extract-content" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.159627 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac161fc-d06b-4d0d-b059-796e0592cfe7" containerName="extract-content" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.160174 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac161fc-d06b-4d0d-b059-796e0592cfe7" containerName="registry-server" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.160189 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b853317-baf3-4f45-b212-df3ea408de49" containerName="registry-server" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.160198 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c40c551-88a3-4049-b64f-5874d246b71c" containerName="registry-server" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.161716 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7brg" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.178288 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7brg"] Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.243704 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e035df-9a59-41b7-b30c-29d329c6e871-utilities\") pod \"redhat-marketplace-n7brg\" (UID: \"42e035df-9a59-41b7-b30c-29d329c6e871\") " pod="openshift-marketplace/redhat-marketplace-n7brg" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.243779 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e035df-9a59-41b7-b30c-29d329c6e871-catalog-content\") pod \"redhat-marketplace-n7brg\" (UID: \"42e035df-9a59-41b7-b30c-29d329c6e871\") " pod="openshift-marketplace/redhat-marketplace-n7brg" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.243835 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzr6z\" (UniqueName: \"kubernetes.io/projected/42e035df-9a59-41b7-b30c-29d329c6e871-kube-api-access-gzr6z\") pod \"redhat-marketplace-n7brg\" (UID: \"42e035df-9a59-41b7-b30c-29d329c6e871\") " pod="openshift-marketplace/redhat-marketplace-n7brg" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.344739 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzr6z\" (UniqueName: \"kubernetes.io/projected/42e035df-9a59-41b7-b30c-29d329c6e871-kube-api-access-gzr6z\") pod \"redhat-marketplace-n7brg\" (UID: \"42e035df-9a59-41b7-b30c-29d329c6e871\") " pod="openshift-marketplace/redhat-marketplace-n7brg" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.344969 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e035df-9a59-41b7-b30c-29d329c6e871-utilities\") pod \"redhat-marketplace-n7brg\" (UID: \"42e035df-9a59-41b7-b30c-29d329c6e871\") " pod="openshift-marketplace/redhat-marketplace-n7brg" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.345014 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e035df-9a59-41b7-b30c-29d329c6e871-catalog-content\") pod \"redhat-marketplace-n7brg\" (UID: \"42e035df-9a59-41b7-b30c-29d329c6e871\") " pod="openshift-marketplace/redhat-marketplace-n7brg" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.345672 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e035df-9a59-41b7-b30c-29d329c6e871-catalog-content\") pod \"redhat-marketplace-n7brg\" (UID: \"42e035df-9a59-41b7-b30c-29d329c6e871\") " pod="openshift-marketplace/redhat-marketplace-n7brg" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.345672 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e035df-9a59-41b7-b30c-29d329c6e871-utilities\") pod \"redhat-marketplace-n7brg\" (UID: \"42e035df-9a59-41b7-b30c-29d329c6e871\") " pod="openshift-marketplace/redhat-marketplace-n7brg" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.368302 4691 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-gzr6z\" (UniqueName: \"kubernetes.io/projected/42e035df-9a59-41b7-b30c-29d329c6e871-kube-api-access-gzr6z\") pod \"redhat-marketplace-n7brg\" (UID: \"42e035df-9a59-41b7-b30c-29d329c6e871\") " pod="openshift-marketplace/redhat-marketplace-n7brg" Dec 02 08:55:22 crc kubenswrapper[4691]: I1202 08:55:22.499970 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7brg" Dec 02 08:55:23 crc kubenswrapper[4691]: I1202 08:55:23.092625 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7brg"] Dec 02 08:55:23 crc kubenswrapper[4691]: I1202 08:55:23.317535 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-wqdhp_e4e7cee0-91cd-406a-a496-a13b4ee91e1e/nmstate-console-plugin/0.log" Dec 02 08:55:23 crc kubenswrapper[4691]: I1202 08:55:23.492540 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-7scpl_c0ce4e86-7cec-4db1-975d-51ee41f94337/nmstate-handler/0.log" Dec 02 08:55:23 crc kubenswrapper[4691]: I1202 08:55:23.559908 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-f2q6n_dec2c181-9b51-4e2b-95d1-d98fa9102b3a/kube-rbac-proxy/0.log" Dec 02 08:55:23 crc kubenswrapper[4691]: I1202 08:55:23.625113 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-f2q6n_dec2c181-9b51-4e2b-95d1-d98fa9102b3a/nmstate-metrics/0.log" Dec 02 08:55:23 crc kubenswrapper[4691]: I1202 08:55:23.726912 4691 generic.go:334] "Generic (PLEG): container finished" podID="42e035df-9a59-41b7-b30c-29d329c6e871" containerID="e9870f9fd99835d04bce647bfaa6f0246e9676f9eadec6385588f40661288fcf" exitCode=0 Dec 02 08:55:23 crc kubenswrapper[4691]: I1202 08:55:23.727098 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7brg" event={"ID":"42e035df-9a59-41b7-b30c-29d329c6e871","Type":"ContainerDied","Data":"e9870f9fd99835d04bce647bfaa6f0246e9676f9eadec6385588f40661288fcf"} Dec 02 08:55:23 crc kubenswrapper[4691]: I1202 08:55:23.727188 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7brg" event={"ID":"42e035df-9a59-41b7-b30c-29d329c6e871","Type":"ContainerStarted","Data":"3a0b5922cb5db4de2c938d2c8c158815318ce7fc015b920025c2ecbd583dfe56"} Dec 02 08:55:23 crc kubenswrapper[4691]: I1202 08:55:23.820406 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-mztlh_28cc1c9c-b36a-4783-ba53-4504f085b70d/nmstate-operator/0.log" Dec 02 08:55:23 crc kubenswrapper[4691]: I1202 08:55:23.824162 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-8slld_ef02c668-a715-48cb-8efb-9c52cdb28e9d/nmstate-webhook/0.log" Dec 02 08:55:24 crc kubenswrapper[4691]: I1202 08:55:24.740605 4691 generic.go:334] "Generic (PLEG): container finished" podID="42e035df-9a59-41b7-b30c-29d329c6e871" containerID="5658c981f60fce4dca04db4fdf95d92b0fbebc6158f38749a054151af82ded7c" exitCode=0 Dec 02 08:55:24 crc kubenswrapper[4691]: I1202 08:55:24.740691 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7brg" event={"ID":"42e035df-9a59-41b7-b30c-29d329c6e871","Type":"ContainerDied","Data":"5658c981f60fce4dca04db4fdf95d92b0fbebc6158f38749a054151af82ded7c"} 
Dec 02 08:55:25 crc kubenswrapper[4691]: I1202 08:55:25.752704 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7brg" event={"ID":"42e035df-9a59-41b7-b30c-29d329c6e871","Type":"ContainerStarted","Data":"e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748"}
Dec 02 08:55:25 crc kubenswrapper[4691]: I1202 08:55:25.773073 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-n7brg" podStartSLOduration=2.261211305 podStartE2EDuration="3.773052045s" podCreationTimestamp="2025-12-02 08:55:22 +0000 UTC" firstStartedPulling="2025-12-02 08:55:23.729053373 +0000 UTC m=+4171.513132245" lastFinishedPulling="2025-12-02 08:55:25.240894123 +0000 UTC m=+4173.024972985" observedRunningTime="2025-12-02 08:55:25.770222804 +0000 UTC m=+4173.554301666" watchObservedRunningTime="2025-12-02 08:55:25.773052045 +0000 UTC m=+4173.557130907"
Dec 02 08:55:32 crc kubenswrapper[4691]: I1202 08:55:32.506551 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-n7brg"
Dec 02 08:55:32 crc kubenswrapper[4691]: I1202 08:55:32.507315 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-n7brg"
Dec 02 08:55:32 crc kubenswrapper[4691]: I1202 08:55:32.577036 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-n7brg"
Dec 02 08:55:32 crc kubenswrapper[4691]: I1202 08:55:32.883287 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-n7brg"
Dec 02 08:55:32 crc kubenswrapper[4691]: I1202 08:55:32.943112 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7brg"]
Dec 02 08:55:34 crc kubenswrapper[4691]: I1202 08:55:34.850987 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-n7brg" podUID="42e035df-9a59-41b7-b30c-29d329c6e871" containerName="registry-server" containerID="cri-o://e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748" gracePeriod=2
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.601125 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7brg"
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.715414 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzr6z\" (UniqueName: \"kubernetes.io/projected/42e035df-9a59-41b7-b30c-29d329c6e871-kube-api-access-gzr6z\") pod \"42e035df-9a59-41b7-b30c-29d329c6e871\" (UID: \"42e035df-9a59-41b7-b30c-29d329c6e871\") "
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.716071 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e035df-9a59-41b7-b30c-29d329c6e871-catalog-content\") pod \"42e035df-9a59-41b7-b30c-29d329c6e871\" (UID: \"42e035df-9a59-41b7-b30c-29d329c6e871\") "
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.716133 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e035df-9a59-41b7-b30c-29d329c6e871-utilities\") pod \"42e035df-9a59-41b7-b30c-29d329c6e871\" (UID: \"42e035df-9a59-41b7-b30c-29d329c6e871\") "
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.717616 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e035df-9a59-41b7-b30c-29d329c6e871-utilities" (OuterVolumeSpecName: "utilities") pod "42e035df-9a59-41b7-b30c-29d329c6e871" (UID: "42e035df-9a59-41b7-b30c-29d329c6e871"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.747218 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e035df-9a59-41b7-b30c-29d329c6e871-kube-api-access-gzr6z" (OuterVolumeSpecName: "kube-api-access-gzr6z") pod "42e035df-9a59-41b7-b30c-29d329c6e871" (UID: "42e035df-9a59-41b7-b30c-29d329c6e871"). InnerVolumeSpecName "kube-api-access-gzr6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.753724 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e035df-9a59-41b7-b30c-29d329c6e871-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "42e035df-9a59-41b7-b30c-29d329c6e871" (UID: "42e035df-9a59-41b7-b30c-29d329c6e871"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.817983 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzr6z\" (UniqueName: \"kubernetes.io/projected/42e035df-9a59-41b7-b30c-29d329c6e871-kube-api-access-gzr6z\") on node \"crc\" DevicePath \"\""
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.818018 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e035df-9a59-41b7-b30c-29d329c6e871-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.818028 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e035df-9a59-41b7-b30c-29d329c6e871-utilities\") on node \"crc\" DevicePath \"\""
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.861266 4691 generic.go:334] "Generic (PLEG): container finished" podID="42e035df-9a59-41b7-b30c-29d329c6e871" containerID="e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748" exitCode=0
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.861309 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7brg" event={"ID":"42e035df-9a59-41b7-b30c-29d329c6e871","Type":"ContainerDied","Data":"e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748"}
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.861341 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-n7brg" event={"ID":"42e035df-9a59-41b7-b30c-29d329c6e871","Type":"ContainerDied","Data":"3a0b5922cb5db4de2c938d2c8c158815318ce7fc015b920025c2ecbd583dfe56"}
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.861335 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-n7brg"
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.861356 4691 scope.go:117] "RemoveContainer" containerID="e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748"
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.882013 4691 scope.go:117] "RemoveContainer" containerID="5658c981f60fce4dca04db4fdf95d92b0fbebc6158f38749a054151af82ded7c"
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.916917 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7brg"]
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.935643 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-n7brg"]
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.943424 4691 scope.go:117] "RemoveContainer" containerID="e9870f9fd99835d04bce647bfaa6f0246e9676f9eadec6385588f40661288fcf"
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.966646 4691 scope.go:117] "RemoveContainer" containerID="e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748"
Dec 02 08:55:35 crc kubenswrapper[4691]: E1202 08:55:35.967279 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748\": container with ID starting with e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748 not found: ID does not exist" containerID="e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748"
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.967311 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748"} err="failed to get container status \"e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748\": rpc error: code = NotFound desc = could not find container \"e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748\": container with ID starting with e773a115df30816ee9d17ba73e39e67265d99df2ed7c11e35847a975dc67a748 not found: ID does not exist"
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.967333 4691 scope.go:117] "RemoveContainer" containerID="5658c981f60fce4dca04db4fdf95d92b0fbebc6158f38749a054151af82ded7c"
Dec 02 08:55:35 crc kubenswrapper[4691]: E1202 08:55:35.967630 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5658c981f60fce4dca04db4fdf95d92b0fbebc6158f38749a054151af82ded7c\": container with ID starting with 5658c981f60fce4dca04db4fdf95d92b0fbebc6158f38749a054151af82ded7c not found: ID does not exist" containerID="5658c981f60fce4dca04db4fdf95d92b0fbebc6158f38749a054151af82ded7c"
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.967650 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5658c981f60fce4dca04db4fdf95d92b0fbebc6158f38749a054151af82ded7c"} err="failed to get container status \"5658c981f60fce4dca04db4fdf95d92b0fbebc6158f38749a054151af82ded7c\": rpc error: code = NotFound desc = could not find container \"5658c981f60fce4dca04db4fdf95d92b0fbebc6158f38749a054151af82ded7c\": container with ID starting with 5658c981f60fce4dca04db4fdf95d92b0fbebc6158f38749a054151af82ded7c not found: ID does not exist"
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.967662 4691 scope.go:117] "RemoveContainer" containerID="e9870f9fd99835d04bce647bfaa6f0246e9676f9eadec6385588f40661288fcf"
Dec 02 08:55:35 crc kubenswrapper[4691]: E1202 08:55:35.968049 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9870f9fd99835d04bce647bfaa6f0246e9676f9eadec6385588f40661288fcf\": container with ID starting with e9870f9fd99835d04bce647bfaa6f0246e9676f9eadec6385588f40661288fcf not found: ID does not exist" containerID="e9870f9fd99835d04bce647bfaa6f0246e9676f9eadec6385588f40661288fcf"
Dec 02 08:55:35 crc kubenswrapper[4691]: I1202 08:55:35.968076 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9870f9fd99835d04bce647bfaa6f0246e9676f9eadec6385588f40661288fcf"} err="failed to get container status \"e9870f9fd99835d04bce647bfaa6f0246e9676f9eadec6385588f40661288fcf\": rpc error: code = NotFound desc = could not find container \"e9870f9fd99835d04bce647bfaa6f0246e9676f9eadec6385588f40661288fcf\": container with ID starting with e9870f9fd99835d04bce647bfaa6f0246e9676f9eadec6385588f40661288fcf not found: ID does not exist"
Dec 02 08:55:36 crc kubenswrapper[4691]: I1202 08:55:36.576975 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e035df-9a59-41b7-b30c-29d329c6e871" path="/var/lib/kubelet/pods/42e035df-9a59-41b7-b30c-29d329c6e871/volumes"
Dec 02 08:55:41 crc kubenswrapper[4691]: I1202 08:55:41.470447 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-zj6r5_d016272b-85ec-410a-9392-050c9c0a5ff1/kube-rbac-proxy/0.log"
Dec 02 08:55:41 crc kubenswrapper[4691]: I1202 08:55:41.552745 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-zj6r5_d016272b-85ec-410a-9392-050c9c0a5ff1/controller/0.log"
Dec 02 08:55:41 crc kubenswrapper[4691]: I1202 08:55:41.639491 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-frr-files/0.log"
Dec 02 08:55:41 crc kubenswrapper[4691]: I1202 08:55:41.884577 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-frr-files/0.log"
Dec 02 08:55:41 crc kubenswrapper[4691]: I1202 08:55:41.923976 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-metrics/0.log"
Dec 02 08:55:41 crc kubenswrapper[4691]: I1202 08:55:41.934787 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-reloader/0.log"
Dec 02 08:55:41 crc kubenswrapper[4691]: I1202 08:55:41.955852 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-reloader/0.log"
Dec 02 08:55:42 crc kubenswrapper[4691]: I1202 08:55:42.162044 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-metrics/0.log"
Dec 02 08:55:42 crc kubenswrapper[4691]: I1202 08:55:42.185944 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-metrics/0.log"
Dec 02 08:55:42 crc kubenswrapper[4691]: I1202 08:55:42.277972 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-reloader/0.log"
Dec 02 08:55:42 crc kubenswrapper[4691]: I1202 08:55:42.285400 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-frr-files/0.log"
Dec 02 08:55:42 crc kubenswrapper[4691]: I1202 08:55:42.870218 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-reloader/0.log"
Dec 02 08:55:42 crc kubenswrapper[4691]: I1202 08:55:42.901198 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-frr-files/0.log"
Dec 02 08:55:42 crc kubenswrapper[4691]: I1202 08:55:42.983947 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/cp-metrics/0.log"
Dec 02 08:55:42 crc kubenswrapper[4691]: I1202 08:55:42.984971 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/controller/0.log"
Dec 02 08:55:43 crc kubenswrapper[4691]: I1202 08:55:43.119789 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/frr-metrics/0.log"
Dec 02 08:55:43 crc kubenswrapper[4691]: I1202 08:55:43.271252 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/kube-rbac-proxy/0.log"
Dec 02 08:55:43 crc kubenswrapper[4691]: I1202 08:55:43.310608 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/kube-rbac-proxy-frr/0.log"
Dec 02 08:55:43 crc kubenswrapper[4691]: I1202 08:55:43.371464 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/reloader/0.log"
Dec 02 08:55:43 crc kubenswrapper[4691]: I1202 08:55:43.685612 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-7vgpc_14bf1d46-7291-4940-9d53-9361142ad142/frr-k8s-webhook-server/0.log"
Dec 02 08:55:43 crc kubenswrapper[4691]: I1202 08:55:43.910999 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-568f45db7d-dr99j_f921d74e-40cb-430a-a228-ec4681e9251d/manager/0.log"
Dec 02 08:55:43 crc kubenswrapper[4691]: I1202 08:55:43.963573 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-656d54dc75-wplwb_0d3a05b7-702e-4564-9014-edffe6fc64ea/webhook-server/0.log"
Dec 02 08:55:44 crc kubenswrapper[4691]: I1202 08:55:44.269701 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bv29g_defc2342-4163-4319-afaa-fa2eb042082c/kube-rbac-proxy/0.log"
Dec 02 08:55:44 crc kubenswrapper[4691]: I1202 08:55:44.943944 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-bv29g_defc2342-4163-4319-afaa-fa2eb042082c/speaker/0.log"
Dec 02 08:55:44 crc kubenswrapper[4691]: I1202 08:55:44.966524 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-p9xxd_7790b39b-8f2f-4637-8514-497457777b14/frr/0.log"
Dec 02 08:55:56 crc kubenswrapper[4691]: I1202 08:55:56.998624 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/util/0.log"
Dec 02 08:55:57 crc kubenswrapper[4691]: I1202 08:55:57.235807 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/util/0.log"
Dec 02 08:55:57 crc kubenswrapper[4691]: I1202 08:55:57.254516 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/pull/0.log"
Dec 02 08:55:57 crc kubenswrapper[4691]: I1202 08:55:57.280690 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/pull/0.log"
Dec 02 08:55:57 crc kubenswrapper[4691]: I1202 08:55:57.451053 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/util/0.log"
Dec 02 08:55:57 crc kubenswrapper[4691]: I1202 08:55:57.452678 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/extract/0.log"
Dec 02 08:55:57 crc kubenswrapper[4691]: I1202 08:55:57.479172 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212f7rstg_9f953091-57b0-4169-81cc-16a8bbf4a356/pull/0.log"
Dec 02 08:55:57 crc kubenswrapper[4691]: I1202 08:55:57.628958 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/util/0.log"
Dec 02 08:55:57 crc kubenswrapper[4691]: I1202 08:55:57.780600 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/util/0.log"
Dec 02 08:55:57 crc kubenswrapper[4691]: I1202 08:55:57.805160 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/pull/0.log"
Dec 02 08:55:57 crc kubenswrapper[4691]: I1202 08:55:57.815480 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/pull/0.log"
Dec 02 08:55:57 crc kubenswrapper[4691]: I1202 08:55:57.957744 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/util/0.log"
Dec 02 08:55:57 crc kubenswrapper[4691]: I1202 08:55:57.961337 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/pull/0.log"
Dec 02 08:55:58 crc kubenswrapper[4691]: I1202 08:55:58.025554 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f835twxk_e067e362-dd94-4d98-83b9-e3108fbdef06/extract/0.log"
Dec 02 08:55:58 crc kubenswrapper[4691]: I1202 08:55:58.127092 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/extract-utilities/0.log"
Dec 02 08:55:58 crc kubenswrapper[4691]: I1202 08:55:58.346178 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/extract-content/0.log"
Dec 02 08:55:58 crc kubenswrapper[4691]: I1202 08:55:58.348134 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/extract-content/0.log"
Dec 02 08:55:58 crc kubenswrapper[4691]: I1202 08:55:58.348326 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/extract-utilities/0.log"
Dec 02 08:55:58 crc kubenswrapper[4691]: I1202 08:55:58.538905 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/extract-content/0.log"
Dec 02 08:55:58 crc kubenswrapper[4691]: I1202 08:55:58.573227 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/extract-utilities/0.log"
Dec 02 08:55:58 crc kubenswrapper[4691]: I1202 08:55:58.804164 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/extract-utilities/0.log"
Dec 02 08:55:59 crc kubenswrapper[4691]: I1202 08:55:59.019239 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/extract-utilities/0.log"
Dec 02 08:55:59 crc kubenswrapper[4691]: I1202 08:55:59.044448 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/extract-content/0.log"
Dec 02 08:55:59 crc kubenswrapper[4691]: I1202 08:55:59.122500 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-g6czb_18fe1559-e4cf-4738-ba81-28146b21a37a/registry-server/0.log"
Dec 02 08:55:59 crc kubenswrapper[4691]: I1202 08:55:59.147327 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/extract-content/0.log"
Dec 02 08:55:59 crc kubenswrapper[4691]: I1202 08:55:59.267741 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/extract-content/0.log"
Dec 02 08:55:59 crc kubenswrapper[4691]: I1202 08:55:59.290950 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/extract-utilities/0.log"
Dec 02 08:55:59 crc kubenswrapper[4691]: I1202 08:55:59.663611 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kb4km_411ccc93-fc18-44f5-b96f-f2da874ae9be/marketplace-operator/0.log"
Dec 02 08:55:59 crc kubenswrapper[4691]: I1202 08:55:59.780347 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/extract-utilities/0.log"
Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.081905 4691 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/extract-utilities/0.log" Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.122033 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/extract-content/0.log" Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.147100 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/extract-content/0.log" Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.186978 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-k929f_c31c7ca5-195b-41cd-9dee-849169e0fc79/registry-server/0.log" Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.330258 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/extract-content/0.log" Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.362478 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/extract-utilities/0.log" Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.532574 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-f7x42_d26a045c-658e-4950-9d31-98fcc7405794/registry-server/0.log" Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.555342 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/extract-utilities/0.log" Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.727402 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/extract-content/0.log" Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.744874 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/extract-utilities/0.log" Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.746677 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/extract-content/0.log" Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.927919 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/extract-utilities/0.log" Dec 02 08:56:00 crc kubenswrapper[4691]: I1202 08:56:00.936705 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/extract-content/0.log" Dec 02 08:56:01 crc kubenswrapper[4691]: I1202 08:56:01.521202 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8hdfj_ee931362-9b98-4d81-b928-7f9bc9810dea/registry-server/0.log" Dec 02 08:57:21 crc kubenswrapper[4691]: I1202 08:57:21.898403 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:57:21 crc 
kubenswrapper[4691]: I1202 08:57:21.899036 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:57:51 crc kubenswrapper[4691]: I1202 08:57:51.899122 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:57:51 crc kubenswrapper[4691]: I1202 08:57:51.899827 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:57:55 crc kubenswrapper[4691]: I1202 08:57:55.274068 4691 generic.go:334] "Generic (PLEG): container finished" podID="55555f90-c926-494d-8b43-e7099f85c550" containerID="898066b8a1ffe22516b6bda6441cced35f660c3172d46412ffbe772d76ab60ae" exitCode=0 Dec 02 08:57:55 crc kubenswrapper[4691]: I1202 08:57:55.274124 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bxqm7/must-gather-92sjw" event={"ID":"55555f90-c926-494d-8b43-e7099f85c550","Type":"ContainerDied","Data":"898066b8a1ffe22516b6bda6441cced35f660c3172d46412ffbe772d76ab60ae"} Dec 02 08:57:55 crc kubenswrapper[4691]: I1202 08:57:55.275276 4691 scope.go:117] "RemoveContainer" containerID="898066b8a1ffe22516b6bda6441cced35f660c3172d46412ffbe772d76ab60ae" Dec 02 08:57:55 crc kubenswrapper[4691]: I1202 08:57:55.849034 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bxqm7_must-gather-92sjw_55555f90-c926-494d-8b43-e7099f85c550/gather/0.log" Dec 02 08:58:06 crc kubenswrapper[4691]: I1202 08:58:06.169629 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-bxqm7/must-gather-92sjw"] Dec 02 08:58:06 crc kubenswrapper[4691]: I1202 08:58:06.170575 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-bxqm7/must-gather-92sjw" podUID="55555f90-c926-494d-8b43-e7099f85c550" containerName="copy" containerID="cri-o://54326bcba51fbdc52d86cdbd3e63bfad3baf85845d97507d014d141b9dbb536b" gracePeriod=2 Dec 02 08:58:06 crc kubenswrapper[4691]: I1202 08:58:06.180025 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-bxqm7/must-gather-92sjw"] Dec 02 08:58:06 crc kubenswrapper[4691]: I1202 08:58:06.379918 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bxqm7_must-gather-92sjw_55555f90-c926-494d-8b43-e7099f85c550/copy/0.log" Dec 02 08:58:06 crc kubenswrapper[4691]: I1202 08:58:06.384122 4691 generic.go:334] "Generic (PLEG): container finished" podID="55555f90-c926-494d-8b43-e7099f85c550" containerID="54326bcba51fbdc52d86cdbd3e63bfad3baf85845d97507d014d141b9dbb536b" exitCode=143 Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 08:58:08.016337 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bxqm7_must-gather-92sjw_55555f90-c926-494d-8b43-e7099f85c550/copy/0.log" Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 
08:58:08.017834 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bxqm7/must-gather-92sjw" Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 08:58:08.125172 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/55555f90-c926-494d-8b43-e7099f85c550-must-gather-output\") pod \"55555f90-c926-494d-8b43-e7099f85c550\" (UID: \"55555f90-c926-494d-8b43-e7099f85c550\") " Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 08:58:08.125879 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wxc4\" (UniqueName: \"kubernetes.io/projected/55555f90-c926-494d-8b43-e7099f85c550-kube-api-access-7wxc4\") pod \"55555f90-c926-494d-8b43-e7099f85c550\" (UID: \"55555f90-c926-494d-8b43-e7099f85c550\") " Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 08:58:08.186450 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55555f90-c926-494d-8b43-e7099f85c550-kube-api-access-7wxc4" (OuterVolumeSpecName: "kube-api-access-7wxc4") pod "55555f90-c926-494d-8b43-e7099f85c550" (UID: "55555f90-c926-494d-8b43-e7099f85c550"). InnerVolumeSpecName "kube-api-access-7wxc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 08:58:08.237694 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wxc4\" (UniqueName: \"kubernetes.io/projected/55555f90-c926-494d-8b43-e7099f85c550-kube-api-access-7wxc4\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 08:58:08.294515 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55555f90-c926-494d-8b43-e7099f85c550-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "55555f90-c926-494d-8b43-e7099f85c550" (UID: "55555f90-c926-494d-8b43-e7099f85c550"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 08:58:08.341223 4691 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/55555f90-c926-494d-8b43-e7099f85c550-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 08:58:08.407085 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-bxqm7_must-gather-92sjw_55555f90-c926-494d-8b43-e7099f85c550/copy/0.log" Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 08:58:08.407560 4691 scope.go:117] "RemoveContainer" containerID="54326bcba51fbdc52d86cdbd3e63bfad3baf85845d97507d014d141b9dbb536b" Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 08:58:08.407618 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bxqm7/must-gather-92sjw" Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 08:58:08.430171 4691 scope.go:117] "RemoveContainer" containerID="898066b8a1ffe22516b6bda6441cced35f660c3172d46412ffbe772d76ab60ae" Dec 02 08:58:08 crc kubenswrapper[4691]: I1202 08:58:08.578158 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55555f90-c926-494d-8b43-e7099f85c550" path="/var/lib/kubelet/pods/55555f90-c926-494d-8b43-e7099f85c550/volumes" Dec 02 08:58:21 crc kubenswrapper[4691]: I1202 08:58:21.898822 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 08:58:21 crc kubenswrapper[4691]: I1202 08:58:21.899456 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 08:58:21 crc kubenswrapper[4691]: I1202 08:58:21.899520 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" Dec 02 08:58:21 crc kubenswrapper[4691]: I1202 08:58:21.900580 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14a4380008772d3ff382b5213cbcdff86e0c2cf3ba060c707ae18d2b1fb84fae"} pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 02 08:58:21 crc kubenswrapper[4691]: I1202 08:58:21.900655 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" containerID="cri-o://14a4380008772d3ff382b5213cbcdff86e0c2cf3ba060c707ae18d2b1fb84fae" gracePeriod=600 Dec 02 08:58:22 crc kubenswrapper[4691]: I1202 08:58:22.619104 4691 generic.go:334] "Generic (PLEG): container finished" podID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerID="14a4380008772d3ff382b5213cbcdff86e0c2cf3ba060c707ae18d2b1fb84fae" exitCode=0 Dec 02 08:58:22 crc kubenswrapper[4691]: I1202 08:58:22.619301 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerDied","Data":"14a4380008772d3ff382b5213cbcdff86e0c2cf3ba060c707ae18d2b1fb84fae"} Dec 02 08:58:22 crc kubenswrapper[4691]: I1202 08:58:22.619554 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" event={"ID":"82103e10-1127-4a84-b5fc-9d0d6a259932","Type":"ContainerStarted","Data":"50fbd424c457fb9b9dcc80f407c847339bba3d3cda1bc0f5bb7ba54ea5d7ff94"} Dec 02 08:58:22 crc kubenswrapper[4691]: I1202 08:58:22.619599 4691 scope.go:117] "RemoveContainer" containerID="f4fb912c0d2c76263d770f6ac509563af5f7607e9d08f54c5ebb98fd7d06f8c4" Dec 02 08:58:32 crc kubenswrapper[4691]: I1202 08:58:32.759334 4691 scope.go:117] "RemoveContainer" 
containerID="55b7349b45b2709c80511d201e41ee6fd13bbbfe1b359c26ed1396ca3822acee" Dec 02 08:59:32 crc kubenswrapper[4691]: I1202 08:59:32.825224 4691 scope.go:117] "RemoveContainer" containerID="0211c7f1354a97d0a0208ddd6b834c8549dd924ab6718c21d756dc039208be75" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.168792 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f"] Dec 02 09:00:00 crc kubenswrapper[4691]: E1202 09:00:00.170005 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55555f90-c926-494d-8b43-e7099f85c550" containerName="copy" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.170022 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="55555f90-c926-494d-8b43-e7099f85c550" containerName="copy" Dec 02 09:00:00 crc kubenswrapper[4691]: E1202 09:00:00.170042 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e035df-9a59-41b7-b30c-29d329c6e871" containerName="registry-server" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.170051 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e035df-9a59-41b7-b30c-29d329c6e871" containerName="registry-server" Dec 02 09:00:00 crc kubenswrapper[4691]: E1202 09:00:00.170078 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55555f90-c926-494d-8b43-e7099f85c550" containerName="gather" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.170117 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="55555f90-c926-494d-8b43-e7099f85c550" containerName="gather" Dec 02 09:00:00 crc kubenswrapper[4691]: E1202 09:00:00.170130 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e035df-9a59-41b7-b30c-29d329c6e871" containerName="extract-utilities" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.170138 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e035df-9a59-41b7-b30c-29d329c6e871" containerName="extract-utilities" Dec 02 09:00:00 crc kubenswrapper[4691]: E1202 09:00:00.170155 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42e035df-9a59-41b7-b30c-29d329c6e871" containerName="extract-content" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.170162 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e035df-9a59-41b7-b30c-29d329c6e871" containerName="extract-content" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.170407 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="55555f90-c926-494d-8b43-e7099f85c550" containerName="copy" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.170431 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="42e035df-9a59-41b7-b30c-29d329c6e871" containerName="registry-server" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.170453 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="55555f90-c926-494d-8b43-e7099f85c550" containerName="gather" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.171456 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.174235 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.174326 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.181654 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f"] Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.311131 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/860f452b-c40b-4e3a-b46a-bcfb09f1de24-secret-volume\") pod \"collect-profiles-29411100-fp48f\" (UID: \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.311436 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/860f452b-c40b-4e3a-b46a-bcfb09f1de24-config-volume\") pod \"collect-profiles-29411100-fp48f\" (UID: \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.311516 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5gfv\" (UniqueName: \"kubernetes.io/projected/860f452b-c40b-4e3a-b46a-bcfb09f1de24-kube-api-access-d5gfv\") pod \"collect-profiles-29411100-fp48f\" (UID: \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.413236 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/860f452b-c40b-4e3a-b46a-bcfb09f1de24-config-volume\") pod \"collect-profiles-29411100-fp48f\" (UID: \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.413295 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5gfv\" (UniqueName: \"kubernetes.io/projected/860f452b-c40b-4e3a-b46a-bcfb09f1de24-kube-api-access-d5gfv\") pod \"collect-profiles-29411100-fp48f\" (UID: \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.413937 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/860f452b-c40b-4e3a-b46a-bcfb09f1de24-secret-volume\") pod \"collect-profiles-29411100-fp48f\" (UID: \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.415876 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/860f452b-c40b-4e3a-b46a-bcfb09f1de24-config-volume\") pod 
\"collect-profiles-29411100-fp48f\" (UID: \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.420382 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/860f452b-c40b-4e3a-b46a-bcfb09f1de24-secret-volume\") pod \"collect-profiles-29411100-fp48f\" (UID: \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.431041 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5gfv\" (UniqueName: \"kubernetes.io/projected/860f452b-c40b-4e3a-b46a-bcfb09f1de24-kube-api-access-d5gfv\") pod \"collect-profiles-29411100-fp48f\" (UID: \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.508819 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:00 crc kubenswrapper[4691]: I1202 09:00:00.962329 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f"] Dec 02 09:00:01 crc kubenswrapper[4691]: I1202 09:00:01.028638 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" event={"ID":"860f452b-c40b-4e3a-b46a-bcfb09f1de24","Type":"ContainerStarted","Data":"aab2f88e26f2bdd724057302a7ab85f3189c47f880e99975a1868a924482e99e"} Dec 02 09:00:02 crc kubenswrapper[4691]: I1202 09:00:02.039890 4691 generic.go:334] "Generic (PLEG): container finished" podID="860f452b-c40b-4e3a-b46a-bcfb09f1de24" containerID="cbf77289354cb375c824154dfbf4c6945e89099bdfef4e5dbf403ad1e06dfaaf" exitCode=0 Dec 02 09:00:02 crc kubenswrapper[4691]: I1202 09:00:02.040003 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" event={"ID":"860f452b-c40b-4e3a-b46a-bcfb09f1de24","Type":"ContainerDied","Data":"cbf77289354cb375c824154dfbf4c6945e89099bdfef4e5dbf403ad1e06dfaaf"} Dec 02 09:00:03 crc kubenswrapper[4691]: I1202 09:00:03.400255 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:03 crc kubenswrapper[4691]: I1202 09:00:03.495973 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5gfv\" (UniqueName: \"kubernetes.io/projected/860f452b-c40b-4e3a-b46a-bcfb09f1de24-kube-api-access-d5gfv\") pod \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\" (UID: \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\") " Dec 02 09:00:03 crc kubenswrapper[4691]: I1202 09:00:03.496036 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/860f452b-c40b-4e3a-b46a-bcfb09f1de24-secret-volume\") pod \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\" (UID: \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\") " Dec 02 09:00:03 crc kubenswrapper[4691]: I1202 09:00:03.496080 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/860f452b-c40b-4e3a-b46a-bcfb09f1de24-config-volume\") pod \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\" (UID: \"860f452b-c40b-4e3a-b46a-bcfb09f1de24\") " Dec 02 09:00:03 crc kubenswrapper[4691]: I1202 09:00:03.497030 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/860f452b-c40b-4e3a-b46a-bcfb09f1de24-config-volume" (OuterVolumeSpecName: "config-volume") pod "860f452b-c40b-4e3a-b46a-bcfb09f1de24" (UID: "860f452b-c40b-4e3a-b46a-bcfb09f1de24"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 02 09:00:03 crc kubenswrapper[4691]: I1202 09:00:03.497528 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/860f452b-c40b-4e3a-b46a-bcfb09f1de24-config-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:00:03 crc kubenswrapper[4691]: I1202 09:00:03.506118 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/860f452b-c40b-4e3a-b46a-bcfb09f1de24-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "860f452b-c40b-4e3a-b46a-bcfb09f1de24" (UID: "860f452b-c40b-4e3a-b46a-bcfb09f1de24"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:00:03 crc kubenswrapper[4691]: I1202 09:00:03.507753 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860f452b-c40b-4e3a-b46a-bcfb09f1de24-kube-api-access-d5gfv" (OuterVolumeSpecName: "kube-api-access-d5gfv") pod "860f452b-c40b-4e3a-b46a-bcfb09f1de24" (UID: "860f452b-c40b-4e3a-b46a-bcfb09f1de24"). InnerVolumeSpecName "kube-api-access-d5gfv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:00:03 crc kubenswrapper[4691]: I1202 09:00:03.602098 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5gfv\" (UniqueName: \"kubernetes.io/projected/860f452b-c40b-4e3a-b46a-bcfb09f1de24-kube-api-access-d5gfv\") on node \"crc\" DevicePath \"\"" Dec 02 09:00:03 crc kubenswrapper[4691]: I1202 09:00:03.602547 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/860f452b-c40b-4e3a-b46a-bcfb09f1de24-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 02 09:00:04 crc kubenswrapper[4691]: I1202 09:00:04.059084 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" event={"ID":"860f452b-c40b-4e3a-b46a-bcfb09f1de24","Type":"ContainerDied","Data":"aab2f88e26f2bdd724057302a7ab85f3189c47f880e99975a1868a924482e99e"} Dec 02 09:00:04 crc kubenswrapper[4691]: I1202 09:00:04.059128 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aab2f88e26f2bdd724057302a7ab85f3189c47f880e99975a1868a924482e99e" Dec 02 09:00:04 crc kubenswrapper[4691]: I1202 09:00:04.059140 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29411100-fp48f" Dec 02 09:00:04 crc kubenswrapper[4691]: I1202 09:00:04.479854 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs"] Dec 02 09:00:04 crc kubenswrapper[4691]: I1202 09:00:04.488778 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29411055-56vvs"] Dec 02 09:00:04 crc kubenswrapper[4691]: I1202 09:00:04.575243 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a5baae-3ed8-4d6c-8b6c-81569413503e" path="/var/lib/kubelet/pods/e3a5baae-3ed8-4d6c-8b6c-81569413503e/volumes" Dec 02 09:00:32 crc kubenswrapper[4691]: I1202 09:00:32.893311 4691 scope.go:117] "RemoveContainer" containerID="0299b0430c8051be7f1504692503ce23d299866cdfe54676e3c5ee0236b508c1" Dec 02 09:00:51 crc kubenswrapper[4691]: I1202 09:00:51.899171 4691 patch_prober.go:28] interesting pod/machine-config-daemon-mgbt6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 02 09:00:51 crc kubenswrapper[4691]: I1202 09:00:51.899806 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-mgbt6" podUID="82103e10-1127-4a84-b5fc-9d0d6a259932" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.175961 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29411101-bwlbp"] Dec 02 09:01:00 crc kubenswrapper[4691]: E1202 09:01:00.177144 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860f452b-c40b-4e3a-b46a-bcfb09f1de24" containerName="collect-profiles" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.177162 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="860f452b-c40b-4e3a-b46a-bcfb09f1de24" containerName="collect-profiles" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 
09:01:00.177414 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="860f452b-c40b-4e3a-b46a-bcfb09f1de24" containerName="collect-profiles" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.178346 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.193703 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411101-bwlbp"] Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.314556 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-config-data\") pod \"keystone-cron-29411101-bwlbp\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.314794 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-fernet-keys\") pod \"keystone-cron-29411101-bwlbp\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.314949 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdz65\" (UniqueName: \"kubernetes.io/projected/58b580b1-1191-4d6a-a8c5-63c6ee67b901-kube-api-access-pdz65\") pod \"keystone-cron-29411101-bwlbp\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.314975 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-combined-ca-bundle\") pod \"keystone-cron-29411101-bwlbp\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.419429 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-config-data\") pod \"keystone-cron-29411101-bwlbp\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.419513 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-fernet-keys\") pod \"keystone-cron-29411101-bwlbp\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.419611 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdz65\" (UniqueName: \"kubernetes.io/projected/58b580b1-1191-4d6a-a8c5-63c6ee67b901-kube-api-access-pdz65\") pod \"keystone-cron-29411101-bwlbp\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.419631 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-combined-ca-bundle\") pod \"keystone-cron-29411101-bwlbp\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.426240 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-combined-ca-bundle\") pod \"keystone-cron-29411101-bwlbp\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.426388 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-fernet-keys\") pod \"keystone-cron-29411101-bwlbp\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.429983 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-config-data\") pod \"keystone-cron-29411101-bwlbp\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.441582 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdz65\" (UniqueName: \"kubernetes.io/projected/58b580b1-1191-4d6a-a8c5-63c6ee67b901-kube-api-access-pdz65\") pod \"keystone-cron-29411101-bwlbp\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:00 crc kubenswrapper[4691]: I1202 09:01:00.504164 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:01 crc kubenswrapper[4691]: I1202 09:01:01.578431 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29411101-bwlbp"] Dec 02 09:01:01 crc kubenswrapper[4691]: I1202 09:01:01.691530 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411101-bwlbp" event={"ID":"58b580b1-1191-4d6a-a8c5-63c6ee67b901","Type":"ContainerStarted","Data":"e33006f3cbd1f4127edd77c6347b382ffdb35bdd052f7bf3927fc3b758afc279"} Dec 02 09:01:02 crc kubenswrapper[4691]: I1202 09:01:02.701724 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411101-bwlbp" event={"ID":"58b580b1-1191-4d6a-a8c5-63c6ee67b901","Type":"ContainerStarted","Data":"e9654d29a7c62b0fea42071b8c07723948c96b0e4656a1a021e37f601362aa55"} Dec 02 09:01:02 crc kubenswrapper[4691]: I1202 09:01:02.733898 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29411101-bwlbp" podStartSLOduration=2.733878202 podStartE2EDuration="2.733878202s" podCreationTimestamp="2025-12-02 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-02 09:01:02.73141265 +0000 UTC m=+4510.515491512" watchObservedRunningTime="2025-12-02 09:01:02.733878202 +0000 UTC m=+4510.517957064" Dec 02 09:01:05 crc kubenswrapper[4691]: I1202 09:01:05.729574 4691 generic.go:334] "Generic (PLEG): container finished" podID="58b580b1-1191-4d6a-a8c5-63c6ee67b901" containerID="e9654d29a7c62b0fea42071b8c07723948c96b0e4656a1a021e37f601362aa55" exitCode=0 Dec 02 09:01:05 crc kubenswrapper[4691]: I1202 09:01:05.729706 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411101-bwlbp" event={"ID":"58b580b1-1191-4d6a-a8c5-63c6ee67b901","Type":"ContainerDied","Data":"e9654d29a7c62b0fea42071b8c07723948c96b0e4656a1a021e37f601362aa55"} Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.151588 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29411101-bwlbp" Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.222285 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-config-data\") pod \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.222579 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-combined-ca-bundle\") pod \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.222812 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdz65\" (UniqueName: \"kubernetes.io/projected/58b580b1-1191-4d6a-a8c5-63c6ee67b901-kube-api-access-pdz65\") pod \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.222883 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-fernet-keys\") pod \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\" (UID: \"58b580b1-1191-4d6a-a8c5-63c6ee67b901\") " Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.229450 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "58b580b1-1191-4d6a-a8c5-63c6ee67b901" (UID: "58b580b1-1191-4d6a-a8c5-63c6ee67b901"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.230542 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58b580b1-1191-4d6a-a8c5-63c6ee67b901-kube-api-access-pdz65" (OuterVolumeSpecName: "kube-api-access-pdz65") pod "58b580b1-1191-4d6a-a8c5-63c6ee67b901" (UID: "58b580b1-1191-4d6a-a8c5-63c6ee67b901"). InnerVolumeSpecName "kube-api-access-pdz65". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.261280 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58b580b1-1191-4d6a-a8c5-63c6ee67b901" (UID: "58b580b1-1191-4d6a-a8c5-63c6ee67b901"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.298620 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-config-data" (OuterVolumeSpecName: "config-data") pod "58b580b1-1191-4d6a-a8c5-63c6ee67b901" (UID: "58b580b1-1191-4d6a-a8c5-63c6ee67b901"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.324793 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.324845 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdz65\" (UniqueName: \"kubernetes.io/projected/58b580b1-1191-4d6a-a8c5-63c6ee67b901-kube-api-access-pdz65\") on node \"crc\" DevicePath \"\"" Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.324858 4691 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.324869 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/58b580b1-1191-4d6a-a8c5-63c6ee67b901-config-data\") on node \"crc\" DevicePath \"\"" Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.753016 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29411101-bwlbp" event={"ID":"58b580b1-1191-4d6a-a8c5-63c6ee67b901","Type":"ContainerDied","Data":"e33006f3cbd1f4127edd77c6347b382ffdb35bdd052f7bf3927fc3b758afc279"} Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.753072 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e33006f3cbd1f4127edd77c6347b382ffdb35bdd052f7bf3927fc3b758afc279" Dec 02 09:01:07 crc kubenswrapper[4691]: I1202 09:01:07.753139 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29411101-bwlbp"